Hadoop/Data Architect

We are looking to hire a Hadoop/Data Architect in our Product Development organization. A successful candidate will have strong analytical, technical, and communication skills and extensive hands-on experience designing and building Hadoop, Linux, and SQL Server applications. You must have proven experience designing, implementing, and supporting large-scale production Hadoop solutions leveraging ecosystem software including HDFS, MapReduce, YARN, Hive, Sqoop, Flume, and Kafka; Hadoop certification is preferred. Additionally, the ideal candidate will have experience in large-scale relational database design and administration.

This is an exciting opportunity to be part of a growing Cybersecurity company and to help develop and shape the initiatives and strategy of a growing Product Development team. Under the direction of the Product Director, this position will help define business requirements and establish technology direction to support Product Roadmaps and the SPHERE vision.

If you are a self-starter who enjoys problem solving and collaborating on Agile teams, and who thrives in a fast-moving, ever-changing startup environment, then read on!

Essential Functions:

  • A Hadoop/Data Architect who excels at designing and developing high-quality products with intuitive interfaces and has the drive to ensure that success criteria are met within budget and schedule constraints.
  • Design large-scale production Hadoop solutions.
  • Deploy new Hadoop environments and expand existing environments.
  • Model, design, and implement Hive and Impala SQL tables.
  • Write data queries in the Hadoop environment using tools including Hive, Druid, and Phoenix.
  • Assist the Product group in defining new feature requirements.
  • Translate business requirements into technical requirements.
  • Design and implement technology solutions with key architectural attributes in mind (e.g., security, reliability, scalability, maintainability).
  • Provide guidance to the Development team to incorporate key architectural attributes into designs.
  • Work with peers to troubleshoot and resolve project issues.
  • Participate in project release efforts ranging in scope from small to very large.
  • Be well versed in coding, scripting, load testing, and performance tuning.
  • Develop positive relationships with business partners to better understand their needs, manage expectations, and deliver value-added solutions.
  • Collaborate with other teams (technical and non-technical) to ensure successful project delivery and adherence to SDLC processes.

Requirements:

  • Minimum education: B.S. in Computer Science, IT, or a related field.
  • 12+ years’ experience with product development and architecture of software, applications, and services.
  • 5+ years’ experience as a Hadoop/Data Architect (Linux, SQL) (required).
  • Hands-on experience with HDFS, MapReduce, YARN, Hive, Sqoop, Flume, and Kafka.
  • Demonstrated experience implementing design patterns commonly used in Hadoop-based deployments.
  • Strong Hadoop query coding and performance tuning skills.
  • Strong relational database design and complex SQL coding and tuning skills.
  • Experience designing and implementing mission-critical systems that manage large-scale data sets.
  • Experience with Java development, debugging, and profiling.
  • Experience implementing software and/or solutions in the enterprise Linux environment.
  • Understanding of various enterprise security solutions such as LDAP, Kerberos, and Active Directory.
  • Understanding of network configuration, devices, and protocols.
  • Understanding of the SDLC and experience with Agile/Scrum methodology.
  • Prior experience in onshore-offshore development projects.
  • Excellent organizational and communication skills, both oral and written.
  • Ability to facilitate, influence and manage change to deliver solutions.

Additional Information/Skills:

  • Entrepreneurial drive and a demonstrated ability to achieve stretch goals in an innovative, fast-paced environment; able to fit in well within a startup environment.
  • Experience administering multi-node Hadoop clusters is a major plus.
  • Prior experience working for a financial services company is a plus.
  • Experience in the following technologies is a plus:
    • Linux Shell scripting and System Administration
    • Database Administration (SQL Server, Oracle or Sybase)
    • Cloud (Azure, AWS)
    • Storage platforms (Windows, NetApp, EMC Isilon, NAS, NFS)
    • Device connectors (Unix, Windows, SCCM, ServiceNow, CMDB, Active Directory)
    • Third-party interfaces (OneDrive, SharePoint, Microsoft System Center, Active Directory, PeopleSoft, Office 365)
    • T-SQL, PowerShell, Python, Vue.js, jQuery, Visual Studio, Git, VS Code, SQL Server Management Studio, Jenkins
    • UI Frameworks (Vue, React, Angular, D3)

Apply for this position
