Scenario #1: For AWS GovCloud, supplying the S3 region (or endpoint) is mandatory to successfully access any S3 bucket. In an S3 connection we can easily provide this information, and hence file ingestion using a BDCA agent is possible. Howev...
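As context for why the region matters, here is a minimal sketch (Python/boto3, not ZDP code) of reaching a GovCloud bucket; the bucket name is a hypothetical placeholder:

```python
import boto3

# A GovCloud bucket is only reachable when the client is pinned to a
# GovCloud region or given an explicit GovCloud endpoint; with the
# default commercial endpoint the request fails.
s3 = boto3.client(
    "s3",
    region_name="us-gov-west-1",                            # GovCloud region
    endpoint_url="https://s3.us-gov-west-1.amazonaws.com",  # optional explicit endpoint
)

# List objects in a hypothetical bucket to verify connectivity.
response = s3.list_objects_v2(Bucket="example-govcloud-bucket")
for obj in response.get("Contents", []):
    print(obj["Key"])
```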
Hi, as a user I need the ability to change the connection associated with a Landing zone. Currently we only allow viewing and deleting the source directory. https://internal.docs.zaloni.com/6.2.0/ingestion/file_view/adding_a_source_directory.htm
When importing from an Oracle database, the user is facing problems with type conversion. The user is importing columns with the Number datatype in Oracle. The expected behavior they want while importing the columns (illustrated in the sketch after this entry) is that: Number data...
Kavel Baruah
over 1 year ago
in Data Ingestion
0 votes
On the Backlog
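A plausible reading of the Oracle Number request above, sketched in Python; the precision/scale rule below is an assumption about the desired mapping, not ZDP's actual behavior:

```python
from typing import Optional

def map_oracle_number(precision: Optional[int], scale: Optional[int]) -> str:
    """Assumed desired mapping for Oracle NUMBER(p, s) columns.

    Rule (an assumption, not confirmed by the truncated request): integral
    NUMBERs should land in integer types and fractional ones in DECIMAL,
    rather than everything being forced to one generic type.
    """
    if precision is None:          # unconstrained NUMBER
        return "DECIMAL(38,10)"    # illustrative default
    if not scale:                  # scale 0 or None -> integral value
        return "BIGINT" if precision <= 18 else f"DECIMAL({precision},0)"
    return f"DECIMAL({precision},{scale})"

# Examples of the assumed mapping:
print(map_oracle_number(10, 0))   # -> BIGINT
print(map_oracle_number(20, 0))   # -> DECIMAL(20,0)
print(map_oracle_number(12, 2))   # -> DECIMAL(12,2)
```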
Hi, as a business user I would like to ingest data from Cassandra DB into a Hadoop data lake. Therefore, please create a connector to fetch data into HDFS using functionality similar to DBIngestion (a sketch follows this entry).
Ajinkya Rasam
over 1 year ago
in Data Ingestion
0 votes
Future consideration
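A minimal PySpark sketch of the kind of Cassandra-to-HDFS connector being requested, assuming the spark-cassandra-connector package is on the classpath; the host, keyspace, table, and target path are all hypothetical:

```python
from pyspark.sql import SparkSession

# Sketch only: requires the spark-cassandra-connector package.
spark = (
    SparkSession.builder
    .appName("cassandra-to-hdfs-sketch")
    .config("spark.cassandra.connection.host", "cassandra.example.com")  # hypothetical
    .getOrCreate()
)

# Read a Cassandra table into a DataFrame...
df = (
    spark.read
    .format("org.apache.spark.sql.cassandra")
    .options(keyspace="example_ks", table="example_table")  # hypothetical names
    .load()
)

# ...and land it in HDFS, analogous to a DBIngestion-style import.
df.write.mode("overwrite").parquet("hdfs:///data/landing/example_table")
```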
The customer is trying to use the DB Import action to ingest data from views in an Oracle database. Expected behavior: the DB Import action should support ingesting data from Oracle views along with registering the Entity in the catalog.
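For reference, Sqoop itself can already read from a view the same way it reads a table; a sketch (Python driving the sqoop CLI, with hypothetical connection details) of what such an import would effectively run:

```python
import subprocess

# Sqoop treats a view like a table: pass the view name to --table.
# Since a view has no primary key to split on, force a single mapper
# (-m 1) or supply --split-by. Connection details are hypothetical.
subprocess.run([
    "sqoop", "import",
    "--connect", "jdbc:oracle:thin:@db.example.com:1521/ORCLPDB",
    "--username", "zdp_user",
    "--password-file", "/user/zdp/.oracle_pass",
    "--table", "SALES_SUMMARY_VW",          # an Oracle view, not a base table
    "--target-dir", "/data/landing/sales_summary_vw",
    "-m", "1",
], check=True)
```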
Hi, in the DB Import action we have an option to import all tables in a DB. When this option is selected, ZDP creates a Sqoop command for every single table and executes an MR job for each table (see the sketch after this entry). Meaning, if there are 1000 tables in the DB, then ZDP will fire 10...
Ajinkya Rasam
about 2 years ago
in Data Ingestion
0 votes
Will not implement
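To make the fan-out described above concrete, a sketch of the per-table behavior; table discovery and connection details are placeholders:

```python
import subprocess

jdbc_url = "jdbc:oracle:thin:@db.example.com:1521/ORCLPDB"  # hypothetical
tables = ["T_0001", "T_0002"]  # imagine ~1000 entries from the DB catalog

# One sqoop invocation -> one MapReduce job per table, so 1000 tables
# mean 1000 separate MR jobs queued on the cluster.
for table in tables:
    subprocess.run([
        "sqoop", "import",
        "--connect", jdbc_url,
        "--username", "zdp_user",
        "--password-file", "/user/zdp/.oracle_pass",
        "--table", table,
        "--target-dir", f"/data/landing/{table.lower()}",
    ], check=True)
```

Note that Sqoop's own import-all-tables tool also runs one import job per table under the hood, so any batching or throttling would have to happen above Sqoop.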
Ability to send values from Pre-Ingestion to Post-Ingestion
Request for the ability to transfer variables set in pre-ingestion to the workflow in post-ingestion. In some scenarios we need to set a few variables in pre-ingestion that then need to be used in the post-ingestion workflow (see the sketch after this entry).
Chawjyoti Buragohain
about 2 years ago
in Data Ingestion / Workflow
0 votes
Future consideration
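One way such a handoff is commonly done, sketched under the assumption that the pre- and post-ingestion steps share a staging location; the file path and keys are hypothetical, not a ZDP API:

```python
import json

HANDOFF_PATH = "/staging/ingestion_run_42/vars.json"  # hypothetical shared location

def pre_ingestion():
    # Persist variables computed during pre-ingestion...
    variables = {"batch_id": "42", "source_system": "oracle-prod"}
    with open(HANDOFF_PATH, "w") as fh:
        json.dump(variables, fh)

def post_ingestion():
    # ...and read them back in the post-ingestion workflow.
    with open(HANDOFF_PATH) as fh:
        variables = json.load(fh)
    print("resuming with batch", variables["batch_id"])
```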
Support record-level insert, update, and delete on DFS using Apache Hudi
Apache Hudi (Hadoop Upserts Deletes and Incrementals) is an open-source data management framework used to simplify incremental data processing and data pipeline development. Apache Hudi enables you to manage data at the record level in DFS storage (see the upsert sketch after this entry) to simplify ...
Adi Bandaru
over 2 years ago
in Data Ingestion
0 votes
Future consideration
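A minimal PySpark sketch of a record-level upsert with the Hudi datasource, assuming the hudi-spark bundle is on the classpath; the table name, key fields, and path are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hudi-upsert-sketch").getOrCreate()

# Illustrative change set: one record keyed by "id", ordered by "ts".
updates = spark.createDataFrame(
    [(1, "alice", "2024-01-02")],
    ["id", "name", "ts"],
)

# Upsert: rows with an existing record key are updated in place,
# new keys are inserted.
(
    updates.write.format("hudi")
    .option("hoodie.table.name", "example_tbl")
    .option("hoodie.datasource.write.recordkey.field", "id")
    .option("hoodie.datasource.write.precombine.field", "ts")
    .option("hoodie.datasource.write.operation", "upsert")
    .mode("append")
    .save("hdfs:///data/hudi/example_tbl")
)
```

Record-level deletes work the same way, with hoodie.datasource.write.operation set to "delete".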