Hadoop Cluster Optimization Analysis and Maturity Assessment Training

  • Learn via: Classroom / Virtual Classroom / Online
  • Duration: 4 Days
  • We can host this training at your preferred location. Contact us!

Hadoop Data Platform capabilities evolve continuously through the power of open community innovation. Meanwhile, your business requirements are also changing at a fast pace, with more data applications and increased workloads being added to your cluster. As with any mission-critical software platform, your cluster's performance must be optimized to ensure it continues to meet your evolving requirements. A Hadoop Professional Services Architect can help you identify opportunities not only to overcome the potential complexities introduced by the vast number of configuration permutations available on the HDP platform, but also to manage the complex interplay of external factors impacting the cluster itself. The Architect will lead this analysis and build a custom blueprint for your administration team, providing an approach to improve the performance and stability of your environment.

Highlights

  • Leverage the knowledge and experience of Hadoop Professional Services to prepare your organization for the next phase of Apache™ Hadoop® adoption and maturation. 
  • Collaborate with our team to create a blueprint to drive next steps around Hadoop in your enterprise. 
  • Improve your team's understanding of the impact of external factors on cluster performance.

Our Goal

Our goal is to design a blueprint with you that helps your team improve the performance and stability of your Hadoop clusters, reduce your operational effort, and overcome some of the common challenges of advancing your Hadoop maturity.

Main Points

Long-term success with your Hadoop implementation can be negatively impacted by factors both internal and external to the cluster itself. Internal challenges include cluster misconfiguration caused by local environmental issues, human error, or knowledge and expertise deficiencies that stem from an inability to keep pace with innovation. External factors include suboptimal development code or practices and enterprise environment misalignment. All of these variables affect your cluster's ability to meet or exceed performance and stability expectations. A blueprint that addresses these immediate impacts, while also considering upcoming requirements and the expected SLAs of future use cases, will support the development of a strategy to overcome these challenges and drive toward operational excellence. 

What We Offer

The Hadoop Optimization Analysis and Blueprint helps your organization accelerate its ability to overcome some of these challenges through a collaborative analysis engagement with Hadoop platform experts. The engagement has several facets: a Current State Assessment of topics focusing on customer-identified pain points; a Cluster Health Check to verify that key components are working effectively, or that your administration plan has an approach to resolving any issues found; and a Use Case Assessment activity centered on a workshop to understand the upcoming opportunities for your organization around the Hadoop platform. Each of these activities is incorporated into a set of documented findings and recommendations.

Why Our Services

The Hadoop Services team brings together the experience gained from hundreds of Hadoop implementations with customers around the globe and pairs it with collective knowledge and recommended practices. Our proven methodologies are derived from field experience and from close collaboration with expert peers, Product Management, Hortonworks Support, and the Engineering teams who are at the heart of the development of the open source components.

Service Description


Project Steps 

  • Current State Assessment (CSA) 
  • Review Cluster Physical Topology 
  • Cursory hardware and network analysis in alignment with SLA requirements and use case roadmap 
  • Validation of OS/network configuration and setup (storage, “ulimits”, firewall, supported OS) 
  • Data organization within HDFS (e.g. staging concept) 
  • Capacity, sizing, and growth strategy 
  • HDP architecture and design assessment 
  • Backup and DR requirements and strategies 
  • Review of security model 
  • Review HA configuration 
  • Assess current and recommended resource management (e.g. scheduler, queues) 
  • Assess cluster utilization and utilization patterns 
  • Evaluate current state platform and data architectures against use case blueprint 
  • Evaluate current state blueprint against HDP product strategy 
  • Draft future state recommendations 
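Several of the CSA checks above (the ulimit, swap, and related OS-level items in particular) lend themselves to simple scripted verification. The following is a minimal sketch, assuming a Linux node; the threshold values are illustrative community recommendations, not HDP-mandated figures, so align them with your own standards:

```shell
#!/bin/sh
# Illustrative OS-level pre-check for a Hadoop worker node.
# Thresholds below are assumptions for demonstration, not fixed requirements.
MIN_OPEN_FILES=10000
MAX_SWAPPINESS=10

check() {  # check NAME ACTUAL OP LIMIT -> prints PASS or WARN
  if [ "$2" "$3" "$4" ] 2>/dev/null; then
    echo "PASS  $1=$2"
  else
    echo "WARN  $1=$2 (recommended $3 $4)"
  fi
}

open_files=$(ulimit -n)
case "$open_files" in
  *[!0-9]*) echo "INFO  open-files ulimit=$open_files" ;;  # e.g. "unlimited"
  *)        check "open-files ulimit" "$open_files" -ge "$MIN_OPEN_FILES" ;;
esac

# Low swappiness keeps the kernel from swapping out long-lived JVM heaps.
if [ -r /proc/sys/vm/swappiness ]; then
  check "vm.swappiness" "$(cat /proc/sys/vm/swappiness)" -le "$MAX_SWAPPINESS"
fi
```

A script of this shape can be run across all nodes before the assessment to surface obvious configuration drift early.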

Cluster Health Check (CHC) 

  • Hadoop deployment and configuration assessment 
  • Platform security implementation assessment 
  • Data ingestion process and configuration analysis 
  • Data lifecycle assessment 
  • Draft cluster recommendations 
  • Benchmark current cluster performance profile 
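The performance-profile benchmark is commonly started with the stock Hadoop benchmark jobs (TestDFSIO for HDFS throughput, TeraSort for MapReduce shuffle/sort). A hedged sketch follows; the jar paths and data sizes are assumptions that vary by distribution and version, and the helper only executes the jobs when `hadoop` is actually on the PATH, echoing the plan otherwise:

```shell
#!/bin/sh
# Sketch of a baseline benchmark run. Jar locations are assumptions --
# adjust for your distribution and version.
EXAMPLES_JAR=${EXAMPLES_JAR:-/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-examples.jar}
TEST_JAR=${TEST_JAR:-/usr/hdp/current/hadoop-mapreduce-client/hadoop-mapreduce-client-jobclient-tests.jar}

run() {  # print each command before running it; skip execution if hadoop is absent
  echo "+ $*"
  if command -v hadoop >/dev/null 2>&1; then "$@"; fi
}

# HDFS throughput baseline: write, then read, 10 x 1 GB files
run hadoop jar "$TEST_JAR" TestDFSIO -write -nrFiles 10 -size 1GB
run hadoop jar "$TEST_JAR" TestDFSIO -read  -nrFiles 10 -size 1GB

# MapReduce shuffle/sort baseline: generate ~100 GB, sort it, validate the result
run hadoop jar "$EXAMPLES_JAR" teragen 1000000000 /benchmarks/teragen
run hadoop jar "$EXAMPLES_JAR" terasort /benchmarks/teragen /benchmarks/terasort
run hadoop jar "$EXAMPLES_JAR" teravalidate /benchmarks/terasort /benchmarks/teravalidate
```

Capturing the same runs before and after any remediation work gives a like-for-like comparison of the cluster's performance profile.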

Use Case Documentation (UCD) 

  • Conduct workshop to: 
      • Capture use cases 
      • Assess and prioritize use cases 
      • Create use case inventory and blueprint 

Deliverables 

  • Cluster Assessment and Recommendations Blueprint
  • Use Case Inventory Document 

Key Assumptions 

Customer Tasks and Dependencies 

  • Customer will deliver a known-issues log prior to the engagement 
  • Customer will provide direct access to the cluster for all Hadoop Architects 


Recommended Customer Team 

  • Executive Sponsor 
  • Project Manager/Technical Lead/Decision Maker 
  • Hadoop Administrator 
  • Network Administrator 
  • Key Technologists (dependent on in-scope CSA activities) 

Out of Scope 

  • Deep dive into specific topics not explicitly defined in a Statement of Work 
  • Resolution of any discovered issues or recommendations 
  • Use Case Ideation Workshops or Use Case capture beyond what is already known 
  • Any 3rd party tool analysis or assessment



Contact us for more details about our training programs and for all other enquiries!