
Maximize the Value of Data-in-Motion with Big Data from the Internet of Things


Hortonworks DataFlow (HDF™)

Hortonworks DataFlow (HDF) provides the only end-to-end platform that collects, curates, analyzes and acts on data in real time, on-premises or in the cloud, through a drag-and-drop visual interface. HDF is an integrated solution built on Apache NiFi/MiNiFi, Apache Kafka, Apache Storm and Druid.

The HDF real-time streaming analytics platform comprises data flow management, stream processing, and enterprise services.

Powering the Future of Data

HDF Data-In-Motion Platform

Three Major Components of Hortonworks DataFlow


Easy, Secure, and Reliable Way to Manage Data Flow 

Collect and manipulate Internet of Things big data flows securely and efficiently while providing real-time operational visibility, control, and management.


Immediate and Continuous Insights  

Build streaming analytics applications in minutes to capture perishable insights in real-time without writing a single line of code.

Learn More

Corporate Governance, Security and Operations 

Manage the HDF and HDP ecosystem with a comprehensive management console for provisioning, monitoring, and governance.

Learn More

Integrated, Data-Source-Agnostic Collection Platform

HDF provides full-featured data collection capabilities that are streaming-data agnostic, with over 220 integrated processors. Big Data from the Internet of Things can be collected from dynamic and distributed sources of differing formats, schemas, protocols, speeds and sizes, such as machines, geolocation devices, clickstreams, files, social feeds, log files and video.
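As a concrete illustration of edge collection, a MiNiFi agent is typically driven by a config.yml that tails a local source and ships it to a remote NiFi instance. The sketch below is abbreviated and hypothetical: the file path, URL and port name are placeholders, and the exact keys should be checked against the MiNiFi release documentation.

```yaml
# Hypothetical, abbreviated MiNiFi config.yml sketch
MiNiFi Config Version: 3
Flow Controller:
  name: edge-log-collector
Processors:
- name: TailAppLog
  class: org.apache.nifi.processors.standard.TailFile
  scheduling strategy: TIMER_DRIVEN
  scheduling period: 1 sec
  Properties:
    File to Tail: /var/log/app/app.log        # placeholder path
Connections:
- name: TailAppLog/success/edge-logs
  source name: TailAppLog
  source relationship names:
  - success
  destination name: edge-logs
Remote Process Groups:
- url: https://nifi.example.com:9090/nifi     # placeholder NiFi endpoint
  Input Ports:
  - name: edge-logs
```

The agent tails the log file and forwards each line over Site-to-Site to the named input port on the central NiFi cluster, where the rest of the flow takes over.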

More info:

  • How real-time, data-source-agnostic dataflow management makes data movement easy
    Watch Video
  • Learn what HDF can do to optimize log analytics from the edge
    Read More
Powerful Data Collection


With HDF, data collection is no longer a tedious process. You can manage data in full flight with a visual control panel to adjust sources, join and split streams, and prioritize data flow. HDF also can add contextual data to your streams for more complete analysis and insight. The always-on data provenance and audit trails provide security and governance compliance and troubleshooting as necessary in real-time. Integrated with Apache NiFi, MiNiFi, Kafka and Storm, HDF is ready for high volume event processing for immediate analysis and action. Kafka allows differing rates of data creation and delivery while Storm provides streaming real-time data analytics and immediate insights at a massive scale.
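The flow-prioritization and back-pressure behavior described above can be sketched in miniature. This is a toy Python model, not an HDF or NiFi API: a bounded connection queue rejects new flow files once a threshold is reached (back-pressure) and releases the highest-priority item first.

```python
import heapq
from dataclasses import dataclass, field
from typing import Any

@dataclass(order=True)
class FlowFile:
    priority: int                      # lower value = higher priority
    payload: Any = field(compare=False)

class PrioritizedQueue:
    """Toy connection queue: back-pressure kicks in past a size threshold."""
    def __init__(self, backpressure_threshold: int):
        self.threshold = backpressure_threshold
        self._heap: list[FlowFile] = []

    def offer(self, ff: FlowFile) -> bool:
        if len(self._heap) >= self.threshold:
            return False               # signal back-pressure upstream
        heapq.heappush(self._heap, ff)
        return True

    def poll(self) -> FlowFile:
        return heapq.heappop(self._heap)

q = PrioritizedQueue(backpressure_threshold=3)
q.offer(FlowFile(5, "bulk log batch"))
q.offer(FlowFile(1, "critical sensor alert"))
q.offer(FlowFile(3, "clickstream event"))
accepted = q.offer(FlowFile(2, "late arrival"))   # rejected: queue is full
print(q.poll().payload)   # "critical sensor alert" leaves first
print(accepted)           # False
```

In the real platform, prioritizers and back-pressure thresholds are configured per connection in the NiFi interface rather than in code.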

More info:

  • How streaming data, managed through a real-time graphical user interface built on Apache NiFi, improves operational effectiveness
    Watch Video
Real-Time Data Flow Management


HDF secures end-to-end data flow and routing from source to destination with discrete user authorization and a detailed, real-time visual chain of custody. Use the HDF visual user interface to encrypt streaming data, route it to Kafka, configure buffers and manage congestion so that data can be dynamically prioritized and securely sent. HDF enables role-based data access that allows enterprises to dynamically and securely share select pieces of pertinent data, and it deploys flow management and streaming applications in a Kerberized environment with little operational overhead.
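Granular sharing of "select pieces of pertinent data" can be illustrated with a small sketch. The field tags, clearances and function below are hypothetical, not an HDF interface: each field carries a sensitivity tag, and a redaction step keeps only the fields a role is cleared to see.

```python
# Illustrative attribute-level access control: tags and roles are made up.
FIELD_TAGS = {"patient_id": "PII", "heart_rate": "CLINICAL", "zip_code": "PII"}

CLEARANCES = {"analyst": {"CLINICAL"}, "admin": {"CLINICAL", "PII"}}

def redact(record: dict, role: str) -> dict:
    """Return only the fields whose tag is covered by the role's clearance."""
    allowed = CLEARANCES[role]
    return {k: v for k, v in record.items() if FIELD_TAGS.get(k) in allowed}

record = {"patient_id": "p-1", "heart_rate": 72, "zip_code": "94025"}
print(redact(record, "analyst"))  # {'heart_rate': 72}
print(redact(record, "admin"))   # full record
```

The point of the sketch: filtering happens per field rather than per dataset, so an analyst role can consume a stream without ever seeing PII fields.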

More info:

  • See how granular data access improves on role-based access
    Watch Video
Enterprise-Grade Security


HDF includes a complete streaming analytics module, Streaming Analytics Manager (SAM), for building streaming analytics applications that perform event correlation, context enrichment, complex pattern matching and analytical aggregations, and that raise alerts and notifications when insights are discovered. SAM lets application developers, DevOps engineers and business analysts build, collaborate on, analyze, deploy and manage applications in minutes without writing a single line of code. Analysts use pre-built charts to quickly build analyses and create dashboards, while DevOps can manage and monitor application performance right out of the box.
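The kind of windowed analytical aggregation and alerting a SAM application wires up visually can be sketched as plain code. This is an illustrative Python model under assumed names, not SAM itself: it raises an alert whenever the rolling mean of recent readings crosses a threshold.

```python
from collections import deque

def rolling_alerts(events, window=3, threshold=100.0):
    """Emit (timestamp, mean) alerts whenever the mean of the last
    `window` readings exceeds `threshold`."""
    buf = deque(maxlen=window)
    alerts = []
    for ts, value in events:
        buf.append(value)
        if len(buf) == window and sum(buf) / window > threshold:
            alerts.append((ts, sum(buf) / window))
    return alerts

events = [(1, 90), (2, 95), (3, 99), (4, 120), (5, 130)]
print(rolling_alerts(events))  # windows ending at t=4 and t=5 breach the threshold
```

In SAM the same logic would be assembled from drag-and-drop aggregation and notification components rather than written by hand.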

More info:


HDF includes Schema Registry, a central schema repository that allows analytics applications to interact with each other flexibly. Users can save, edit and retrieve schemas for the data they need, and attach schemas to each piece of data without incurring additional overhead, for greater operational efficiency. With schema version management, data consumers and data producers can evolve at different rates, and schema validation greatly improves data quality. A central schema registry also provides greater governance over how data is used. Schema Registry is integrated with Apache NiFi and the HDF Streaming Analytics Manager.
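Schema version management and validation as described above can be modeled minimally. The class below is an illustrative in-memory sketch, not the Hortonworks Schema Registry API: producers register successive schema versions, and records are validated against any version, so old producers and new consumers can coexist.

```python
class SchemaRegistry:
    """Minimal in-memory sketch of central schema versioning."""
    def __init__(self):
        self._schemas = {}  # name -> list of versions (each a set of field names)

    def register(self, name: str, fields: set) -> int:
        """Add a new schema version; returns a 1-based version number."""
        versions = self._schemas.setdefault(name, [])
        versions.append(frozenset(fields))
        return len(versions)

    def validate(self, name: str, record: dict, version: int) -> bool:
        """A record is valid if it carries every field of that version."""
        return set(record) >= self._schemas[name][version - 1]

reg = SchemaRegistry()
v1 = reg.register("sensor", {"id", "temp"})
v2 = reg.register("sensor", {"id", "temp", "unit"})  # schema evolves
old_record = {"id": 7, "temp": 21.5}
print(reg.validate("sensor", old_record, v1))  # True: old producers still valid
print(reg.validate("sensor", old_record, v2))  # False: missing "unit"
```

Because both versions remain retrievable, a consumer expecting v1 keeps working while new producers move to v2, which is the evolution-at-different-rates property the paragraph describes.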

More info:


Build Real-Time Analytics Faster with Streaming Analytics Manager


Build analytics applications easily with a drag-and-drop visual paradigm and drop-down analytics functions


Analyze quickly with rich visual dashboards and an analytics engine powered by Druid


Operate efficiently with prebuilt monitoring dashboards of system metrics

Manage Data Flows More Easily with Schema Registry


Eliminate the need to code and attach a schema to every piece of data, reducing operational overhead


Allow data consumers and producers to evolve at different rates with schema version management


Store schema for any type of entity or data store, not just Kafka


Get HDF release notes, user and developer guides, and getting-started tutorials.


The industry's best enterprise support for Apache NiFi, Kafka and Storm. Let our team of experts guide you on your journey.


Hands-on training from the big data experts. Training is available in person and on demand, whenever you need us.