Loome Connection
What is Apache Hadoop (HDFS)?
Apache Hadoop Distributed File System (HDFS) is the open-source, distributed file system that provides high-bandwidth data storage for the wider Hadoop framework. It is scalable and portable, and can be accessed through a Java API and shell commands. It is best suited to batch processing of large volumes of data in parallel.
Extract Data From HDFS
Loome makes it simple to connect to Apache Hadoop and extract data for downstream systems such as an Integration Hub, Reporting Data Store, Data Lake or Enterprise Data Warehouse. In-built features allow bulk selection of all source tables/files to be automatically synced on a regular schedule, minimising data load size by leveraging incremental load logic.
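The incremental load logic mentioned above typically follows a high-watermark pattern: each scheduled sync extracts only the rows modified since the previous run. A minimal Python sketch of that general pattern (the function and field names here are illustrative, not Loome's actual implementation):

```python
from datetime import datetime, timezone

def incremental_extract(rows, last_watermark):
    """Return rows modified after last_watermark, plus the new watermark.

    `rows` is a list of dicts with a 'modified_at' datetime field,
    an illustrative stand-in for records read from an HDFS source.
    """
    new_rows = [r for r in rows if r["modified_at"] > last_watermark]
    # Advance the watermark to the latest change seen, so the next
    # scheduled sync only picks up rows modified after this run.
    new_watermark = max((r["modified_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

# Example: two consecutive syncs over the same source table.
t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
t1 = datetime(2024, 1, 2, tzinfo=timezone.utc)
t2 = datetime(2024, 1, 3, tzinfo=timezone.utc)
source = [
    {"id": 1, "modified_at": t1},
    {"id": 2, "modified_at": t2},
]
batch, wm = incremental_extract(source, last_watermark=t0)    # both rows are new
batch2, wm2 = incremental_extract(source, last_watermark=wm)  # nothing new to load
```

Because only changed rows cross the network on each run, the sync cost scales with the rate of change in the source rather than its total size.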
Natively Orchestrate HDFS Integration Tasks
Loome allows orchestration of data pipelines across data engineering, data science and high-performance computing workloads, with native integration of Apache Hadoop data pipeline tasks.
Loome provides a sophisticated workbench for configuration of job and task dependencies, scheduling, detailed logging, automated notifications and API access for dynamic task creation and execution.
Loome can execute tasks stored as scripts in a Git repository, entered via a web interface, or run as operations within a database. Loome includes support for native execution of SQL, Python, Spark, Hive, PowerShell/PowerShell Core and operating system commands.
Loome also simplifies control of deployment across multiple environments, including approval of changes between Development, Test and Production. It further allows you to scale your advanced pipelines onto on-demand clusters without changing a single line of code.
Related Articles
Article
What are the Must-Have Attributes of a Modern Data Warehouse?
Modern data warehouse concepts you should consider before building a data platform for your enterprise.
Article
ETL vs ELT Pipelines in Modern Data Platforms
What is the best choice to transform data in your enterprise data platform?
Article
Why Data Lake Architecture is not a Silver Bullet for Analytics
Understanding the definition of a data lake is the first step to finding the right storage and analytics solution.
Article
Managing Data Governance
Streamlining access to data resources and improving security, organisation-wide
Article
What is a Data Catalogue
A data catalogue is the best solution for managing all of your different data elements, helping to build good organisational data governance.