October 03, 2018

How Johns Hopkins is Utilizing Apache Hadoop to Securely Access Log Events

Yesterday was Customer Experience Day, a day when we had the opportunity to celebrate the people and companies that make great customer experiences happen and to recognize outstanding customer work! As part of this celebration, we hosted a customer webinar with Johns Hopkins University.

Johns Hopkins University is an American private research university, founded in 1876 and located in Baltimore, Maryland. It is considered the first research university in the United States, and is organized into 10 divisions on campuses in Maryland and Washington, D.C. These divisions include the Johns Hopkins School of Medicine and the Applied Physics Laboratory, among various others.

In this webinar we heard from Conrad Fernandes, a cloud and cyber security engineer at the Johns Hopkins Applied Physics Laboratory (APL). Fernandes is a longtime cyber security engineer and architect who has worked with US defense agencies and the DoD since the early 2000s, beginning at Booz Allen Hamilton. He currently serves as a senior cyber security engineer at APL, where he leads security and governance practices on emerging cloud technologies, including commercial and US GovCloud (e.g., Amazon Web Services) and Hadoop-based data science platforms.

Fernandes discusses the strategies used to collect, audit, and access log events from key Apache Hadoop services and forward them to a central server for monitoring, analysis, and response to a suspected breach. This is mission critical for precision medicine and HIPAA-sensitive data, as well as controlled unclassified information (CUI) for defense projects. Johns Hopkins needed a platform that was both robust and secure for housing all of this data. The modern data architecture put into place includes Hortonworks DataFlow, with security and governance provided by Apache Atlas, Apache Ranger, and Apache Knox.
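To make the collect-and-forward pattern concrete, here is a minimal, hypothetical sketch of tailing a Hadoop service audit log and shipping each event to a central collector. This is not the implementation shown in the webinar (which uses Hortonworks DataFlow); the log path, collector host, and syslog framing below are assumptions for illustration only.

```python
# Illustrative sketch only: follow an HDFS audit log and forward each new
# line to a central collector as a UDP syslog-style message. In practice
# this role is played by a flow-management tool such as Hortonworks DataFlow.
import socket
import time

AUDIT_LOG = "/var/log/hadoop/hdfs/hdfs-audit.log"   # assumed audit log path
COLLECTOR = ("siem.example.org", 514)                # assumed central collector

def follow(path):
    """Yield lines appended to the file, similar to `tail -f`."""
    with open(path, "r") as f:
        f.seek(0, 2)  # start at the end of the file
        while True:
            line = f.readline()
            if not line:
                time.sleep(0.5)
                continue
            yield line.rstrip("\n")

def forward(events, collector):
    """Send each audit event to the collector over UDP."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for event in events:
        # <134> = syslog priority for facility local0, severity info
        sock.sendto(f"<134>hdfs-audit: {event}".encode("utf-8"), collector)

if __name__ == "__main__":
    forward(follow(AUDIT_LOG), COLLECTOR)
```

In a production deployment the forwarding layer would also handle buffering, delivery guarantees, and access control, which is where tools like Hortonworks DataFlow, Apache Ranger, and Apache Knox come in.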

The university is now able to achieve results that would not be possible without a big data platform. Data can now be ingested from disparate sources, transported to the Hadoop cluster, and used to help clinicians administer more targeted treatments.

This webinar describes the real business value that Johns Hopkins has been gaining from its data, and how the lab was able to put these solutions into place.

Be sure to check out the webinar on-demand here.

For more about our customers, visit: https://hortonworks.com/customers/

 
