Ability to send values from pre-ingestion to post-ingestion
Request for the ability to transfer variables set during pre-ingestion to the workflow in post-ingestion. In some scenarios we need to set a few variables during pre-ingestion that must then be available to the post-ingestion workflow. A minimal sketch of one possible approach follows this entry.
Chawjyoti Buragohain
over 2 years ago
in Data Ingestion / Workflow
0
Future consideration
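A hypothetical sketch of what this request could look like: the pre-ingestion hook publishes values under a run-scoped key in a shared location, and the post-ingestion workflow reads them back. The directory, function names, and run id are illustrative assumptions, not an existing product API.

```python
# Hypothetical hand-off of values from a pre-ingestion hook to a post-ingestion workflow.
# All paths and names below are assumptions made for illustration only.
import json
from pathlib import Path

CONTEXT_DIR = Path("/shared/ingestion_context")   # assumed location visible to both stages

def save_pre_ingestion_values(run_id: str, values: dict) -> None:
    """Called at the end of the pre-ingestion hook to publish variables."""
    CONTEXT_DIR.mkdir(parents=True, exist_ok=True)
    (CONTEXT_DIR / f"{run_id}.json").write_text(json.dumps(values))

def load_pre_ingestion_values(run_id: str) -> dict:
    """Called at the start of the post-ingestion workflow to read them back."""
    path = CONTEXT_DIR / f"{run_id}.json"
    return json.loads(path.read_text()) if path.exists() else {}

# Pre-ingestion hook:
#   save_pre_ingestion_values("run-123", {"source_batch_id": "B42", "load_date": "2021-06-01"})
# Post-ingestion workflow task:
#   ctx = load_pre_ingestion_values("run-123")
```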
Support record-level insert, update, and delete on DFS using Apache Hudi
Apache Hudi (Hadoop Upserts Deletes and Incrementals) is an open-source data management framework used to simplify incremental data processing and data pipeline development. Apache Hudi enables you to manage data at the record level in DFS storage to simplify ... (see the sketch after this entry).
Adi Bandaru
over 2 years ago
in Data Ingestion
0
Future consideration
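For context, this is a minimal sketch of record-level upserts and deletes with Apache Hudi's Spark datasource, assuming a SparkSession started with the Hudi bundle on the classpath. The table name, paths, and field names are illustrative assumptions.

```python
# Record-level upsert and delete on a DFS-backed Hudi table via the Spark datasource.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-record-level-ops").getOrCreate()

base_path = "hdfs:///warehouse/orders_hudi"   # assumed DFS location
hudi_options = {
    "hoodie.table.name": "orders",
    "hoodie.datasource.write.recordkey.field": "order_id",      # record key
    "hoodie.datasource.write.precombine.field": "updated_at",   # picks the latest version of a key
    "hoodie.datasource.write.partitionpath.field": "order_date",
    "hoodie.datasource.write.operation": "upsert",               # insert-or-update at record level
}

# Upsert: rows whose record key already exists are updated, new keys are inserted.
updates_df = spark.read.parquet("hdfs:///staging/orders_increment")
updates_df.write.format("hudi").options(**hudi_options).mode("append").save(base_path)

# Delete: same datasource, with the write operation switched to "delete".
deletes_df = updates_df.where("status = 'cancelled'")
deletes_df.write.format("hudi") \
    .options(**{**hudi_options, "hoodie.datasource.write.operation": "delete"}) \
    .mode("append").save(base_path)
```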
Currently, a trigger suffix is used to trigger an ingestion manually. In addition to the trigger suffix, it would help if a REST endpoint were provided so that a user (or any third-party tool) could send a request to perform the ingestion for a particular file pattern (a hypothetical example of such a call follows this entry).
Jyotiplaban Talukdar
over 2 years ago
in Data Ingestion
0
Future consideration
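The snippet below is only an illustration of the requested capability: an authenticated REST call that triggers ingestion for a specific file pattern. The endpoint URL, payload fields, and auth header are assumptions made for illustration; no such API is implied to exist today.

```python
# Hypothetical REST trigger for a file-pattern-scoped ingestion run.
import requests

response = requests.post(
    "https://ingestion-host/api/v1/ingestions/trigger",   # assumed endpoint
    headers={"Authorization": "Bearer <token>"},           # assumed auth scheme
    json={
        "source_id": "sales_source",          # assumed source identifier
        "table": "orders",
        "file_pattern": "orders_2021*.csv",   # the pattern this request asks to ingest
    },
    timeout=30,
)
response.raise_for_status()
print(response.json())  # e.g. an ingestion job id that can be polled for status
```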
Client scenario: I have an entity with a custom data format. This entity is associated with an EDQ action of a post-ingestion workflow. While executing the action, I got the exception "Entity with data file CUSTOM is not supported by EDQ".
As a data steward, member of the governance team, or subject matter expert, I want to work with the data interactively to perform enrichment or fix bad data, so that I can update trusted-zone data
Ability to compare data in trusted zone with source of truth data
As a data steward or member of the governance team, I want the ability to compare data stored in the trusted zone with golden-record/source-of-truth data stored in the source platform, so that I can ensure data quality and completeness (a minimal comparison sketch follows this entry).
Nikhil Goel
almost 3 years ago
in Data Quality
0
On the Backlog
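One way such a comparison could work, sketched with PySpark under the assumption that both datasets are readable as DataFrames and share a primary key column. Paths, the key column, and column names are illustrative assumptions.

```python
# Reconcile trusted-zone data against a source-of-truth copy: find missing, extra,
# and mismatched records by key.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trusted-zone-reconciliation").getOrCreate()

trusted = spark.read.parquet("/trusted_zone/customers")      # assumed trusted-zone copy
golden = spark.read.parquet("/source_of_truth/customers")    # assumed golden-record extract

key = "customer_id"
compare_cols = [c for c in trusted.columns if c != key]

# Keys present on one side only.
missing_in_trusted = golden.join(trusted, key, "left_anti")
extra_in_trusted = trusted.join(golden, key, "left_anti")

# Keys present on both sides whose non-key columns differ (row-hash comparison).
def with_row_hash(df):
    return df.withColumn("_row_hash", F.sha2(F.concat_ws("||", *compare_cols), 256))

mismatched = (
    with_row_hash(trusted).alias("t")
    .join(with_row_hash(golden).alias("g"), key)
    .where(F.col("t._row_hash") != F.col("g._row_hash"))
)

print(missing_in_trusted.count(), extra_in_trusted.count(), mismatched.count())
```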