Zaloni Ideas


Data Ingestion


Ability to specify connection in File Wizard

Scenario #1: For AWS GovCloud, it is mandatory to supply the S3 region (or endpoint) to successfully access any S3 bucket. In an S3 connection we can easily provide this information, and hence file ingestion using a BDCA agent is possible. Howev...
Mithulesh Kumar Medhi 7 months ago in Data Ingestion / Catalog / User Interface 0
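
For context, a minimal sketch of what supplying the region and endpoint looks like with boto3; the endpoint URL and bucket name here are illustrative, not ZDP configuration:

```python
import boto3

# In AWS GovCloud, S3 requests fail unless the partition-specific region
# (and often an explicit endpoint) is supplied; the default commercial
# endpoints cannot reach GovCloud buckets.
s3 = boto3.client(
    "s3",
    region_name="us-gov-west-1",                            # GovCloud region
    endpoint_url="https://s3.us-gov-west-1.amazonaws.com",  # explicit endpoint
)

# List a few objects in a bucket (bucket name is illustrative)
resp = s3.list_objects_v2(Bucket="my-govcloud-bucket", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"])
```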

Ability to ingest VSAM data

VSAM format - Ability to process any VSAM format based on configuration
Guest 8 months ago in Data Ingestion 0 Will not implement
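
As a toy illustration of configuration-driven processing, a sketch that slices fixed-width records the way VSAM data laid out by a COBOL copybook is usually decoded; the field layout and sample record are invented:

```python
# Invented field layout: (name, offset, length), as a copybook-derived
# configuration might supply it.
FIELDS = [
    ("account_id", 0, 10),
    ("balance",   10, 12),
    ("status",    22, 1),
]

def parse_record(raw: bytes, encoding: str = "ascii") -> dict:
    """Slice one fixed-width record into named fields per the config.

    Mainframe extracts are often EBCDIC; Python's "cp037" codec could
    be passed as `encoding` in that case.
    """
    text = raw.decode(encoding)
    return {name: text[off:off + ln].strip() for name, off, ln in FIELDS}

record = b"ACCT000001000000123456A"
print(parse_record(record))
# {'account_id': 'ACCT000001', 'balance': '000000123456', 'status': 'A'}
```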

Ability to update connection for LZ

Hi, as a user I need the ability to change the connection associated with a Landing Zone. Currently we only allow viewing and deleting the source directory.
Ajinkya Rasam about 1 year ago in Data Ingestion 0

Data type conversion for DB Import

When importing from an Oracle database, the user faces problems with type conversion. The user is importing columns having the Number datatype in Oracle. The expected behavior they want while importing these columns is that Number data...
Kavel Baruah over 1 year ago in Data Ingestion 0 On the Backlog
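
As an illustration of the kind of mapping being asked for, a hypothetical sketch that picks a target type from an Oracle NUMBER column's precision and scale; the rules here are assumptions for illustration, not ZDP behavior:

```python
from decimal import Decimal
from typing import Optional

def map_oracle_number(precision: Optional[int], scale: Optional[int]) -> type:
    """Pick a target type for an Oracle NUMBER(p, s) column.

    Hypothetical rules: scale 0 with modest precision maps to int;
    anything else maps to Decimal to avoid losing digits.
    """
    if scale in (None, 0) and precision is not None and precision <= 18:
        return int
    return Decimal

# NUMBER(10, 0) -> int, NUMBER(20, 4) -> Decimal
print(map_oracle_number(10, 0))   # <class 'int'>
print(map_oracle_number(20, 4))   # <class 'decimal.Decimal'>
```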

Arena support for Cassandra NoSQL DB

Hi, as a business user I would like to ingest data from Cassandra DB into a Hadoop data lake. Therefore, please create a connector to fetch data into HDFS using functionality similar to DB Ingestion.
Ajinkya Rasam over 1 year ago in Data Ingestion 0 Future consideration
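
One plausible shape for such a connector, sketched with PySpark and the open-source spark-cassandra-connector; the host, keyspace, table, and output path are illustrative, and the connector package must be on the Spark classpath:

```python
from pyspark.sql import SparkSession

# Assumes the connector is available, e.g. Spark started with:
#   --packages com.datastax.spark:spark-cassandra-connector_2.12:3.4.1
spark = (
    SparkSession.builder
    .appName("cassandra-to-hdfs")
    .config("spark.cassandra.connection.host", "cassandra.example.com")
    .getOrCreate()
)

# Read a Cassandra table (keyspace/table names are illustrative)
df = (
    spark.read.format("org.apache.spark.sql.cassandra")
    .options(keyspace="sales", table="orders")
    .load()
)

# Land the data in HDFS as Parquet, mirroring a DB Ingestion-style flow
df.write.mode("overwrite").parquet("hdfs:///data/landing/sales/orders")
```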

DB Import Action to support Oracle Views

The customer is trying to use the DB Import action to ingest data from views in an Oracle database. Expected behavior: the DB Import action should support ingesting data from Oracle views along with registering the Entity in the catalog.
Sunilam Chakraborty almost 2 years ago in Data Ingestion / Workflow 0 On the Roadmap
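
From the ingestion engine's side, reading a view is the same JDBC read as a table; a sketch using Spark's JDBC source, where the connection string, credentials, and view name are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("oracle-view-import").getOrCreate()

# A view name can be supplied to "dbtable" exactly like a table name;
# Oracle's JDBC driver (ojdbc) must be on the classpath.
df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:oracle:thin:@//db.example.com:1521/ORCLPDB1")
    .option("dbtable", "SALES.V_MONTHLY_REVENUE")  # an Oracle view
    .option("user", "ingest_user")
    .option("password", "***")
    .option("driver", "oracle.jdbc.OracleDriver")
    .load()
)

df.write.mode("overwrite").parquet("hdfs:///data/landing/sales/v_monthly_revenue")
```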

Improve DB Import design

Hi, the DB Import action has an option to import all tables in a DB. When this option is selected, ZDP creates a Sqoop command for every single table and executes an MR job for each table. Meaning, if there are 1000 tables in the DB then ZDP will fire 10...
Ajinkya Rasam about 2 years ago in Data Ingestion 0 Will not implement
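
One way that job explosion could be tamed is a bounded worker pool instead of firing one job per table all at once; a hypothetical sketch, where run_sqoop_import stands in for whatever actually builds and submits the per-table job:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_sqoop_import(table: str) -> str:
    """Hypothetical stand-in for launching one per-table import job."""
    # e.g. build and submit the Sqoop command for `table` here
    return f"imported {table}"

tables = [f"table_{i}" for i in range(1000)]

# Cap concurrency so 1000 tables never means 1000 simultaneous MR jobs
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = {pool.submit(run_sqoop_import, t): t for t in tables}
    for fut in as_completed(futures):
        print(fut.result())
```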

Ability to send values from Pre-ingestion to Post-ingestion

Request for the ability to transfer variables set in pre-ingestion to the workflow in post-ingestion. In some scenarios we need to set a few variables in pre-ingestion that need to be used in the post-ingestion workflow.
Chawjyoti Buragohain about 2 years ago in Data Ingestion / Workflow 0 Future consideration
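
A minimal sketch of the requested handoff, assuming the two phases can share a small JSON context file keyed by the ingestion run; the path and variable names are illustrative:

```python
import json
from pathlib import Path

CONTEXT = Path("/tmp/ingestion_context/run_1234.json")  # illustrative path

def pre_ingestion() -> None:
    # Set variables during pre-ingestion and persist them for later phases
    CONTEXT.parent.mkdir(parents=True, exist_ok=True)
    CONTEXT.write_text(json.dumps({"batch_id": "1234", "source": "orders"}))

def post_ingestion() -> None:
    # The post-ingestion workflow reads the same variables back
    ctx = json.loads(CONTEXT.read_text())
    print(f"post-ingestion sees batch_id={ctx['batch_id']}")

pre_ingestion()
post_ingestion()
```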

Ability to use Spark Engine for File ingestion

Ability to select Spark Engine for File data ingestion and import (BDCA)
Sanjay Yadav about 2 years ago in Data Ingestion 1 On the Backlog
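
As a sketch of what a Spark-backed file ingestion could look like, reading a delimited landing-zone file and writing it to the data zone; paths and options are illustrative:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("file-ingestion-spark").getOrCreate()

# Read a delimited landing-zone file with Spark instead of an MR-based agent
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("hdfs:///landing/incoming/orders.csv")  # illustrative path
)

# Write to the managed data zone as Parquet
df.write.mode("append").parquet("hdfs:///data/raw/orders")
```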

Support record-level insert, update, and delete on DFS using Apache Hudi

Apache Hudi (Hadoop Upserts and Incrementals) is an open-source data management framework used to simplify incremental data processing and data pipeline development. Apache Hudi enables you to manage data at the record level in DFS storage to simplify ...
Adi Bandaru over 2 years ago in Data Ingestion 0 Future consideration
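
A minimal PySpark sketch of a Hudi upsert using Hudi's documented write options; the table name, key fields, and path are illustrative, and the Hudi Spark bundle must be on the classpath:

```python
from pyspark.sql import SparkSession

# Assumes the Hudi bundle is available, e.g. Spark started with:
#   --packages org.apache.hudi:hudi-spark3.4-bundle_2.12:0.14.0
spark = SparkSession.builder.appName("hudi-upsert").getOrCreate()

updates = spark.createDataFrame(
    [(1, "alice", "2024-01-02")],
    ["id", "name", "updated_at"],
)

hudi_options = {
    "hoodie.table.name": "customers",
    "hoodie.datasource.write.recordkey.field": "id",         # record key
    "hoodie.datasource.write.precombine.field": "updated_at",
    "hoodie.datasource.write.operation": "upsert",           # record-level merge
}

# Upsert merges records by key instead of rewriting whole partitions,
# which is what enables record-level insert/update/delete on DFS.
(
    updates.write.format("hudi")
    .options(**hudi_options)
    .mode("append")
    .save("hdfs:///data/hudi/customers")
)
```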