Zaloni Ideas

Filter by category: Data Ingestion (showing 32 of 451)

Support record-level insert, update, and delete on DFS using Apache Hudi

Apache Hudi (Hadoop Upserts and Incrementals) is an open-source data management framework used to simplify incremental data processing and data pipeline development. Apache Hudi enables you to manage data at the record level in DFS storage to simplify ...
Adi Bandaru over 2 years ago in Data Ingestion 0 Future consideration
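
For context, a minimal sketch of the record-level upsert Hudi enables, via its Spark datasource (the table name, record key, and DFS path here are hypothetical):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("hudi-upsert-sketch").getOrCreate()

    # Updated records; rows whose record key already exists are overwritten.
    updates = spark.createDataFrame(
        [(1, "alice", "2024-01-02")],
        ["id", "name", "updated_at"],
    )

    (updates.write.format("hudi")
        .option("hoodie.table.name", "customers")
        .option("hoodie.datasource.write.recordkey.field", "id")
        .option("hoodie.datasource.write.precombine.field", "updated_at")
        .option("hoodie.datasource.write.operation", "upsert")
        .mode("append")
        .save("hdfs:///data/customers"))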

Ingestion Trigger

Currently, a trigger suffix is used to trigger an ingestion manually. In addition to the trigger suffix, a REST endpoint should be provided so that a user (or any third-party tool) can send a request to perform the ingestion for a particular file pattern.
Jyotiplaban Talukdar over 2 years ago in Data Ingestion 0 Future consideration
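
A hypothetical sketch of what such a trigger call could look like (the endpoint URL, payload fields, and credentials are assumptions, not an existing ZDP API):

    import requests

    # Ask the hypothetical endpoint to run ingestion for one file pattern.
    resp = requests.post(
        "https://zdp.example.com/api/v1/ingestions/trigger",
        json={"filePattern": "sales_*.csv", "landingZone": "lz1"},
        auth=("svc_ingest", "secret"),
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.json())  # e.g. a run id a third-party tool could poll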

Ingesting Excel file format using wizard and manual options

Support ingesting files in Excel format through both the ingestion wizard and the manual option.
Nikhil Goel almost 3 years ago in Data Ingestion 0 On the Backlog
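
One way this could work under the hood is converting each worksheet to CSV and handing it to the existing file ingestion path; a minimal sketch (pandas and the file names are assumptions, not the actual design):

    import pandas as pd

    # Read every sheet; sheet_name=None returns {sheet name: DataFrame}.
    sheets = pd.read_excel("landing/orders.xlsx", sheet_name=None)
    for name, df in sheets.items():
        # Emit one CSV per sheet for the normal file-pattern ingestion.
        df.to_csv(f"landing/orders_{name}.csv", index=False)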

Ingestion History Enhancements (File Move)

The ingestion history captured by the File Move action does not include all the attributes the customer needs. The necessary attributes are source file name, file size, file type, and associated entity. We should revisit and verify the completeness of the teleme...
Sabyasachi Gupta almost 3 years ago in Data Ingestion 0 On the Backlog
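
A sketch of the history record the customer is asking for (field names are illustrative, not the actual schema):

    from dataclasses import dataclass

    @dataclass
    class FileMoveHistory:
        source_file_name: str  # e.g. "sales_20240101.csv"
        file_size_bytes: int
        file_type: str         # e.g. "csv"
        entity_id: str         # the associated ZDP entity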

Assistance with Destination Path in File Pattern

1. As a data miner who is responsible for bringing data into the ZDP from a variety of sources, but especially from files, 2. I want the ZDP system to assist me in ensuring that the Destination Path specified in the File Pattern matches the Loca...
Nikhil Goel about 3 years ago in Data Ingestion / Metadata 0 On the Backlog
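
A minimal sketch of the kind of check being requested, assuming it means validating that the destination path falls under a registered location (the function and its semantics are hypothetical):

    from pathlib import PurePosixPath

    def destination_under_location(destination: str, location: str) -> bool:
        """True if the destination path is the location or sits beneath it."""
        dest, loc = PurePosixPath(destination), PurePosixPath(location)
        return dest == loc or loc in dest.parents

    print(destination_under_location("/data/raw/sales/2024", "/data/raw/sales"))  # True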

DB Import treats table names as case sensitive although Sqoop import is case insensitive

DB Import: Teradata is case insensitive, so tableA and TABLEA are the same table. Zaloni would show success, but the entity won't be created; the logs would say the table was not found.
Nikhil Goel about 3 years ago in Data Ingestion / Workflow 0 On the Backlog
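
A sketch of the behavior being asked for: resolve the requested name against the source catalog case-insensitively, the way Teradata itself does (names are illustrative):

    def resolve_table(requested: str, catalog_tables: list[str]) -> str | None:
        """Return the catalog's spelling of a table name, ignoring case."""
        wanted = requested.casefold()
        return next((t for t in catalog_tables if t.casefold() == wanted), None)

    print(resolve_table("tableA", ["TABLEA", "ORDERS"]))  # "TABLEA"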

With reference to 9154: Teradata CHAR/VARCHAR changed to STRING by default in Hive using DB Import

The DB Import action is used for importing tables from Teradata. Sqoop, which is the internal implementation of the DB Import action, converts the VARCHAR datatype to the corresponding Hive type STRING, and hence the corresponding ZDP entity field is...
Nikhil Goel about 3 years ago in Data Ingestion / Metadata 0 Changed to Defect
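
Sqoop itself can override the default mapping per column with --map-column-hive; whether DB Import exposes that flag is the open question. A sketch of invoking it directly (the connection details, table, and column are hypothetical):

    import subprocess

    # Keep NAME as VARCHAR(100) in Hive instead of Sqoop's default STRING.
    subprocess.run([
        "sqoop", "import",
        "--connect", "jdbc:teradata://td.example.com/DATABASE=sales",
        "--table", "CUSTOMERS",
        "--hive-import",
        "--map-column-hive", "NAME=VARCHAR(100)",
    ], check=True)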

Ability to start BDCA agents without Ingestion Warden

Starting BDCA agents from the warden or another process poses a problem when switching to the user who is intended to own the BDCA process.
Nikhil Goel over 3 years ago in Data Ingestion 0 Future consideration

Start BDCA agent directly with service id

The customer wants to remove the middleman (e.g., the ingestion warden) so that BDCA agents can be spawned directly on the landing zone server and join the ingestion cluster. File patterns and other ingestion definitions can be further defined by the ...
Nikhil Goel over 3 years ago in Data Ingestion 0 Future consideration
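
A sketch of what direct spawning under a service id could look like, which would also sidestep the ownership problem in the previous idea (the launcher path and account are hypothetical; subprocess's user parameter needs Python 3.9+ and sufficient privileges):

    import subprocess

    # Spawn the agent directly as the service account that should own it,
    # with no warden in between.
    agent = subprocess.Popen(
        ["/opt/bdca/bin/bdca-agent", "--join", "ingestion-cluster"],
        user="bdca_svc",
    )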

Improve handling of JSON files that are a top-level array

Some JSON files consist of multiple JSON objects inside a top-level array. Such a file can be ingested, but the file preview breaks.
Deleted User over 3 years ago in Data Ingestion 0 On the Backlog
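
A sketch of a preview routine that tolerates both shapes, a single object and a top-level array (the preview length is arbitrary):

    import json

    def preview_json(path: str, limit: int = 5) -> list:
        """Return the first few records whether the file holds one
        object or an array of objects."""
        with open(path) as f:
            data = json.load(f)
        records = data if isinstance(data, list) else [data]
        return records[:limit]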