Chief Big Data Architect (Hadoop, Cloudera, Solaris, Linux)
To £800 per day
Initial 6 months (extensions likely)
An experienced Chief Big Data Architect is required by a Global Consultancy to join their architecture team on a contract basis, working on one of Europe's largest data-infrastructure re-platforming programmes.
As Chief Architect you will be responsible for defining a Big Data architecture framework and the strategy for migrating from Solaris to Linux, with final deployment to Cloudera.
Skills & experience required:
10+ years of professional IT experience, including experience with a variety of Big Data platforms.
3+ years of experience as a Hadoop administrator/architect on Cloudera and Hortonworks.
Solaris to Linux migration experience
Hands-on experience installing, configuring, supporting, and managing Hadoop clusters using Cloudera and Hortonworks.
Hands-on experience with Hadoop cluster design, implementation, configuration, administration, debugging, and performance tuning.
Excellent understanding of Hadoop cluster security, with experience implementing secure clusters using Kerberos, Knox, and Ranger.
Hands-on experience installing, configuring, and using Hadoop ecosystem components such as Hive, ZooKeeper, Sqoop, Flume, Hue, and Spark on the Cloudera and Hortonworks Hadoop distributions.
Ability to maintain, support, monitor, and upgrade all Hadoop environments, covering configuration, access control, capacity planning, permissions, and security patching, to ensure continuity across every environment.
Proficient with SDLC methodologies (Waterfall and Agile).
OS: Solaris, Linux (RedHat, CentOS), Windows
RDBMS: Oracle 10g/11gR2, 10g RAC/11gR2 RAC, MySQL
NoSQL: MongoDB
Big Data: Apache Hadoop, Hive, Sqoop, Pig, HBase, Flume, Spark, MapReduce, ZooKeeper, Kylin
Tools: Toad, Erwin, OEM, NetBackup, BMC Patrol, Autosys, Tomcat, Nagios, Ganglia
HA Tools: SharePlex, GoldenGate
Programming: SQL, PL/SQL, HSQL, Unix shell scripting