Scenario #1: For AWS GovCloud, it is mandatory to supply the S3 region (or endpoint) to successfully access any S3 buckets. In an S3 connection, we can easily provide this information, and hence file ingestion using a BDCA agent is possible. However,...
Ability to parse multi-level complex JSON files in ZDP
The current JSON SerDe is unable to parse multi-level JSON. Can we add the OpenX SerDe, which is available as open source? Details: open-source link: https://github.com/rcongiu/Hive-JSON-Serde ; jar files link: http://www.congiu.net/hive-json-serde/1.3.8/cdh5/
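To illustrate the gap: a flat SerDe can only map top-level keys to columns, while multi-level JSON nests values several objects deep. A minimal Python sketch (purely illustrative, not ZDP or SerDe code) of flattening nested keys into dotted column names shows the kind of traversal a multi-level-aware SerDe performs:

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested JSON objects into dotted column names."""
    out = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(value, name))
        else:
            out[name] = value
    return out

record = json.loads('{"id": 1, "user": {"name": "a", "geo": {"lat": 1.5}}}')
print(flatten(record))
# {'id': 1, 'user.name': 'a', 'user.geo.lat': 1.5}
```

The OpenX SerDe linked above handles nesting natively by mapping nested objects to Hive struct columns, so no pre-flattening step is needed once it is registered.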
Need the ability to automate schema mapping of ingested data
PS has implemented a mechanism for schema mapping of ingested data. This allows the columns in a data file to appear in any order; each column is assigned to the right position within the Hive table at run time. It uses column headers to determine t...
over 2 years ago
in Data Ingestion
Display ingestion history for db wizard, db import created entities in entity view ingestion history tab when Display is 'Ingested File Size Per Day'
Current behavior: The 'Ingested File Size Per Day' is shown only for entities associated with file ...
The customer would like details on the row counts ingested from files. Today, ZDP 5.0.2 displays the File Size per Day and File Count per Day. The user would like to validate that these match in both a visual and a cumulative report. Suggestion: Prov...
The customer is trying to use the DB Import action to ingest data from views in an Oracle database. Expected behavior: the DB Import action should support ingesting data from Oracle views along with registering the entity in the catalog.
Hi, as a user I need the ability to change the connection associated with a landing zone. Currently we only allow viewing and deleting the source directory. https://internal.docs.zaloni.com/6.2.0/ingestion/file_view/adding_a_source_directory.htm
Hi, in the DB Import action we have an option to import all tables in a database. When this option is selected, ZDP creates a Sqoop command for every single table and executes an MR job for each one. Meaning, if there are 1000 tables in the database, ZDP will fire 100...
almost 2 years ago
in Data Ingestion
Will not implement
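Assuming the request above is about throttling the job storm (the text is truncated, so this is an assumption), one way to bound the load is to cap how many per-table imports run concurrently rather than launching one job per table all at once. A minimal Python sketch with a bounded worker pool; `import_table` is a hypothetical placeholder for launching one Sqoop import:

```python
from concurrent.futures import ThreadPoolExecutor

def import_table(table):
    # Hypothetical stand-in for launching one Sqoop import / MR job for `table`;
    # a real implementation would shell out to the Sqoop CLI instead.
    return f"imported {table}"

def import_all(tables, max_parallel=4):
    """Run at most `max_parallel` table imports at a time,
    instead of firing one job per table simultaneously."""
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        return list(pool.map(import_table, tables))

print(import_all([f"t{i}" for i in range(10)], max_parallel=3))
```

With 1000 tables this keeps at most `max_parallel` MR jobs in flight at once; results come back in the original table order because `pool.map` preserves ordering.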