BIG DATA Application Developer & Product Trainer

Full time
Canada

Do it with Experts

Job Features

Job Category: Information Technology
Job Title: BIG DATA Application Developer & Product Trainer
Job Location: Victoria, BC and Dartmouth, NS, Canada
Job Reference Number: MGL7014
Employment Type: Full-time
Pay Details: 35 hours per week, $32 per hour; start date: December 2017

BIG DATA Application Developer & Product Trainer

Macropus Global Ltd, Victoria, BC, and Dartmouth, NS, Canada. https://www.macropusglobal.com | https://www.macropuseducation.com

Macropus Global Ltd (www.macropusglobal.com) was founded in January 2009 with a vision of providing the best services in the field of Information Technology consulting and business services. In early 2016, Macropus Global expanded into IT education (www.macropuseducation.com), delivering IT training for corporations and individuals across the globe and placing Canadian individuals into the mainstream Canadian workforce. Macropus Global has since further expanded into Canada immigration and visa services.

We have an immediate opening for an expert Big Data Application Developer for a full-time position in British Columbia and Nova Scotia. Administration experience across multiple domains will be an added advantage. Join our team in a position that lets you grow with us!

Job Summary:

  • Train individuals and corporate customers on Big Data applications, basic setup, and administration.
  • Fill a hands-on development role focused on creating big data and analytics solutions.
  • Contribute code or technical guidance to Apache Hadoop, Spark, or related big data projects.
  • Develop and manage the ingestion of various data sources into a global data lake.
  • Analyze business and functional requirements and contribute to the overall solution.
  • Participate in design reviews and provide input on design recommendations.
  • Participate in project planning sessions with project managers, business analysts, and team members.
  • Translate complex functional and technical requirements into detailed designs.
  • Load disparate data sets using SQL, the Apache Spark framework, the Scala language, and Cloudera Hadoop.
  • Perform complex analysis, design, development, testing, and debugging of computer software.
  • Perform activities related to software design and operating architecture integration, and support the selection of appropriate tools and technologies.
  • Create technical specifications, unit test plans, dataflow diagrams, and logical data models.
  • Experience handling data in various file types: flat files, XML, Parquet, data frames, etc.
  • Experience with Git and Maven.
  • Follow quality assurance guidelines, including the documentation, review, and approval of all project-related artifacts.
  • Architect, implement, and test data processing pipelines and data mining / data science algorithms in a variety of hosted environments such as AWS, Azure, and client technology stacks.
  • Research, experiment with, and apply leading Big Data technologies such as Hadoop, Spark, Redshift, Netezza, SAP HANA, and Microsoft Azure.
  • Design and develop data structures and ETL processes using Hive, HBase, Pig scripts, Java functions, and Spark with Scala.
  • Design, build, install, configure, and support Hadoop.
  • Good exposure to columnar NoSQL databases such as HBase.
  • Candidates must have a deep understanding of logical and physical data modeling for OLTP and OLAP systems, with the ability to translate a logical data model into a relational or non-relational solution as appropriate.
  • Monitor and analyze job performance, file system/disk space management, cluster and database connectivity, and log files.
  • Ingest data using Kafka, Storm, Spark, or Complex Event Processing (CEP).
  • Integrate machine learning within Spark using Spark ML and MLlib.
  • Integrate Hadoop with enterprise BI and EDW tools.
  • Perform performance tuning, YARN administration, and big data administration.
  • Query data with Hive, Pig, and Impala.

Only candidates with the above qualifications should apply!

Key Skills Required:

  • B.S./M.S. in Computer Science, Mathematics, Engineering, or equivalent work experience required.
  • Big Data Hadoop and Spark Developer certification is a big plus.
  • A strong problem-solving skill set and the ability to understand new technologies quickly are essential.
  • Understanding of the basic SQL commands necessary to perform administrative functions.
  • Strong experience with ELT, specifically around data wrangling and transformation.
  • Experience with software engineering in all aspects of the SDLC.
  • Experience with the Apache Spark framework and the Scala language.
  • Experience with Hadoop, data transport, Spark, Hive, and HDFS.
  • Familiarity with web, FTP, API, SQL, and related ETL technologies.
  • Knowledge of NoSQL databases such as Cassandra and RDBMSs such as MySQL.
  • Experience with Java/J2EE, Scala, or Python.
  • Experience with Kafka is a plus.
  • Experience with the Scrum agile methodology and development practices.
  • Experience with Unix/Linux shell scripting or similar programming/scripting skills.
  • Experience developing and managing web services, including REST or SOAP.
  • Experience with AWS is a plus.
  • Knowledge of machine learning algorithms is a big plus.

Qualifications:

  • Master's degree in Computer Engineering and/or Computer Science, or equivalent industry experience.
  • At least 2 to 4 years of Big Data developer experience with mission-critical systems in a medium-to-large environment.
  • Experience working with internal and external organizations, solving complex problems, and handling large-scale projects.
  • Proven ability to drive systems optimizations on large-scale infrastructure.

Assets:

  • Any expert-level platform certifications (both Hadoop and Spark).

We will check references with previous employers. If you are a qualified candidate, please email your cover letter, pay requirements, work history, resume, and references. We will contact the most qualified candidates within the next month. Email responses only.

We are an "Equal Opportunity Employer". This job description/advertisement does not constitute an offer or guarantee of employment.

Apply online only at https://www.macropusglobal.com/career or contact HR via the specified email, jobs@macropusglobal.com. Shortlisted candidates will be given a call.
