Coral is a SQL translation, analysis, and rewrite engine. It establishes a standard intermediate representation, Coral IR, which captures the semantics of relational algebraic expressions independently of any SQL dialect. Coral IR is defined in two forms: one at the abstract syntax tree (AST) layer, and the other at the logical plan layer. The two forms are isomorphic and convertible to each other.
Coral exposes APIs for implementing conversions between SQL dialects and Coral IR in both directions. Currently, Coral supports converting HiveQL and Spark SQL to Coral IR, and converting Coral IR to HiveQL, Spark SQL, and Trino SQL. With multiple SQL dialects supported, Coral can be used to translate SQL statements and views defined in one dialect to equivalent ones in another dialect. It can also be used to interoperate between engines and SQL-powered data sources. For dialect conversion examples, see the modules coral-hive, coral-spark, and coral-trino.
Coral also exposes APIs for Coral IR rewrite and manipulation. This includes rewriting Coral IR expressions to produce semantically equivalent, but more performant expressions. For example, Coral automates incremental view maintenance by rewriting a view definition to an incremental one. See the module coral-incremental for more details. Other Coral rewrite applications include data governance and policy enforcement.
Coral can be used as a library in other projects, or as a service. See instructions below for more details.
- Join the discussion with the community on Slack!
Coral consists of the following modules:
- Coral-Hive: Parses HiveQL (and Spark SQL, which is largely Hive-compatible) into Coral IR.
- Coral-Trino: Generates Trino SQL from Coral IR. Parsing Trino SQL into Coral IR is in progress.
- Coral-Spark: Generates Spark SQL from Coral IR (typically also valid HiveQL).
- Coral-Dbt: Applies Coral transformations to dbt models.
- Coral-Incremental: Rewrites a query into an incremental form for view maintenance.
- Coral-Schema: Derives the Avro schema of a view from its logical plan and the Avro schemas of its base tables.
- Coral-Spark-Plan [WIP]: Converts Spark plan strings into an equivalent logical plan.
- Coral-Visualization: Renders Coral SqlNode and RelNode trees to an image.
- Coral-Service: Exposes Coral via REST APIs (see Coral-as-a-Service for more details).
This project adheres to semantic versioning, where the format x.y.z represents major, minor, and patch version upgrades. When integrating a different version of this project, review the changes that the upgrade may require.
Major Version Upgrade
A major version upgrade represents a version change that introduces backward incompatibility by removal or renaming of classes.
Minor Version Upgrade
A minor version upgrade represents a version change that introduces backward incompatibility by removal or renaming of methods.
Please carefully review the release notes and documentation accompanying each version upgrade to understand the specific changes and the recommended steps for migration.
Clone the repository:

```shell
git clone https://github.com/linkedin/coral.git
```

Build:

Please note that this project requires Python 3 and Java 8 to run. Set JAVA_HOME to the home of an appropriate Java version and then use:

```shell
./gradlew clean build
```

or, set the org.gradle.java.home Gradle property to the Java home of an appropriate version as below:

```shell
./gradlew -Dorg.gradle.java.home=/path/to/java/home clean build
```

The project is under active development and we welcome contributions of different forms. Please see the Contribution Agreement.
- Coral: A SQL translation, analysis, and rewrite engine for modern data lakehouses, LinkedIn Engineering Blog, December 10, 2020.
- Transport: Towards Logical Independence Using Translatable Portable UDFs, LinkedIn Engineering Blog, November 14, 2018.
- Dali Views: Functions as a Service for Big Data, LinkedIn Engineering Blog, November 9, 2017.
- ViewShift: Dynamic Policy Enforcement for Every Data Lake, Databricks Data + AI Summit, June 10, 2025.
- Harnessing Coral and Iceberg for Advanced Incremental View Maintenance, Iceberg Summit, May 2024.
- Incremental View Maintenance with Coral, DBT, and Iceberg, Iceberg Meetup, May 11, 2023.
- Coral: A SQL translation and rewrite engine for modern data lakes, CDMS Workshop @ VLDB 2022, September 2022.
- Coral & Transport UDFs: Building Blocks of a Postmodern Data Warehouse, Facebook HQ, February 28, 2020.
- SQL Telemetry & Intelligence – How we built a Petabyte-scale Data Platform with Fabric, Raki Rahman, Microsoft Fabric Blog, December 16, 2025.
- How Uber Migrated from Hive to Spark SQL for ETL Workloads, Kumudini Kakwani, Suprit Acharya, Nimesh Khandelwal, Akshayaprakash Sharma, Chintan Betrabet, and Aayush Chaturvedi, Uber Engineering Blog, June 12, 2025.
- Data Guard: A Fine-grained Purpose-based Access Control System for Large Data Warehouses, Khai Tran et al., ICDE 2026 (arXiv:2502.01998), February 4, 2025.
- OpenIVM: a SQL-to-SQL Compiler for Incremental Computations, Ilaria Battiston, Kriti Kathuria, and Peter Boncz, SIGMOD 2024 (arXiv:2404.16486), April 25, 2024.
Coral-as-a-Service, or simply Coral Service, is a service that exposes REST APIs allowing users to interact with Coral without necessarily going through a compute engine. Currently, the service supports an API for query translation between different dialects, and another for interacting with a local Hive Metastore to create example databases, tables, and views that can then be referenced in the translation API. The service can be used in two modes: remote Hive Metastore mode and local Hive Metastore mode. The remote mode uses an existing (already deployed) Hive Metastore to resolve tables and views, while the local mode creates an empty embedded Hive Metastore so users can add their own table and view definitions.
A POST API which takes a JSON request body containing the following parameters and returns the translated query:

- `sourceLanguage`: input dialect (e.g., `spark`, `trino`, `hive`; see below for supported inputs)
- `targetLanguage`: output dialect (e.g., `spark`, `trino`, `hive`; see below for supported outputs)
- `query`: SQL query to translate between the two dialects
- [Optional] `rewriteType`: type of Coral IR rewrite (e.g., `incremental`)
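As a minimal sketch, a client could call the translation API from Python as below (this assumes Coral Service is running locally on its default port 8080; the helper function names are illustrative, not part of Coral):

```python
import json
import urllib.request

CORAL_SERVICE = "http://localhost:8080"  # default backend port

def build_translation_request(query, source_language, target_language, rewrite_type=None):
    """Assemble the JSON body expected by /api/translations/translate."""
    body = {
        "sourceLanguage": source_language,
        "targetLanguage": target_language,
        "query": query,
    }
    if rewrite_type is not None:
        body["rewriteType"] = rewrite_type  # e.g. "incremental"
    return body

def translate(query, source_language, target_language, rewrite_type=None):
    """POST the request to a locally running Coral Service and return the translated SQL."""
    payload = build_translation_request(query, source_language, target_language, rewrite_type)
    req = urllib.request.Request(
        CORAL_SERVICE + "/api/translations/translate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```

For example, `translate("SELECT * FROM db1.airport", "hive", "trino")` would issue the same request as the curl example shown later in this README.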
A POST API which takes a SQL statement to create a database/table/view in the local metastore (note: this endpoint is only available with Coral Service in local metastore mode).
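The curl examples later in this README show that this endpoint takes the SQL statement as the raw request body. A hedged Python sketch of a client for it (assuming the service is running locally on port 8080 in local metastore mode; the function names are illustrative, not part of Coral):

```python
import urllib.request

CORAL_SERVICE = "http://localhost:8080"  # default backend port

def catalog_execute_request(ddl_statement):
    """Build the POST request for /api/catalog-ops/execute.

    The endpoint expects the SQL DDL statement itself as the request body.
    """
    return urllib.request.Request(
        CORAL_SERVICE + "/api/catalog-ops/execute",
        data=ddl_statement.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def catalog_execute(ddl_statement):
    """Send the DDL to a locally running Coral Service and return its response text."""
    with urllib.request.urlopen(catalog_execute_request(ddl_statement)) as resp:
        return resp.read().decode("utf-8")
```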
- Clone the Coral repo:

```shell
git clone https://github.com/linkedin/coral.git
```

- From the root directory of Coral, go to the coral-service module:

```shell
cd coral-service
```

- Build:

```shell
../gradlew clean build
```

- Run in local metastore mode:

```shell
../gradlew bootRun --args='--spring.profiles.active=localMetastore'
```

To run in remote metastore mode instead:

- Add your Kerberos client keytab file to `coral-service/src/main/resources`
- Appropriately replace all instances of `SET_ME` in `coral-service/src/main/resources/hive.properties`
- Run:

```shell
../gradlew bootRun
```
You can also specify a custom location for the hive.properties file through `--hivePropsLocation` as follows:

```shell
./gradlew bootRun --args='--hivePropsLocation=/tmp/hive.properties'
```
Then you can interact with the service using your browser or the CLI.
After running `../gradlew bootRun --args='--spring.profiles.active=localMetastore'` (for local metastore mode)
or `../gradlew bootRun` (for remote metastore mode) from the coral-service module, configure and start the UI.
Please note: The backend service runs on port 8080 (by default) and the web UI runs on port 3000 (by default).
- Create a `.env.local` file in the frontend project's root directory
- Copy over the template from `.env.local.example` into the new `.env.local` file
- Fill in the environment variable values in `.env.local`
Install dependencies and start the development server:

```shell
npm install
npm run dev
```

Once compiled, the UI can be accessed from the browser at http://localhost:3000.
The UI provides three features:

- Catalog creation: This feature is only available with Coral Service in local metastore mode; it calls the `/api/catalog-ops/execute` API above. You can enter a SQL statement to create a database/table/view in the local metastore.
- Query translation: This feature is available with Coral Service in both local and remote metastore modes; it calls the `/api/translations/translate` API above. You can enter a SQL query and specify the source and target languages to use the Coral translation service. You can also specify the rewrite type to apply to the input query.
- IR visualization: During translation, graphs of the Coral intermediate representations are generated and shown on screen, including any post-rewrite nodes.
To lint and format the frontend code:

```shell
npm run lint:fix
npm run format
```

Apart from the UI above, you can also interact with the service using the CLI.
Example workflow for local metastore mode:
- Create a database called `db1` in the local metastore using the `/api/catalog-ops/execute` endpoint:

```shell
curl --header "Content-Type: application/json" \
  --request POST \
  --data "CREATE DATABASE IF NOT EXISTS db1" \
  http://localhost:8080/api/catalog-ops/execute
```

Response:

```
Creation successful
```

- Create a table called `airport` within `db1` in the local metastore using the `/api/catalog-ops/execute` endpoint:

```shell
curl --header "Content-Type: application/json" \
  --request POST \
  --data "CREATE TABLE IF NOT EXISTS db1.airport(name string, country string, area_code int, code string, datepartition string)" \
  http://localhost:8080/api/catalog-ops/execute
```

Response:

```
Creation successful
```

- Translate a query on `db1.airport` in the local metastore using the `/api/translations/translate` endpoint:

```shell
curl --header "Content-Type: application/json" \
  --request POST \
  --data '{
    "sourceLanguage":"hive",
    "targetLanguage":"trino",
    "query":"SELECT * FROM db1.airport"
  }' \
  http://localhost:8080/api/translations/translate
```

The translation result is:

Original query in HiveQL:

```sql
SELECT * FROM db1.airport
```

Translated to Trino SQL:

```sql
SELECT "name", "country", "area_code", "code", "datepartition"
FROM "db1"."airport"
```
- Hive to Trino
- Hive to Spark
- Trino to Spark
  Note: During Trino to Spark translations, views referenced in queries are considered to be defined in HiveQL and hence cannot be used when translating a view from Trino. Currently, only referencing base tables is supported in Trino queries. This translation path is currently a POC and may need further improvements.
- Spark to Trino