Job Posted Date : April 9, 2018
The project is to build an enterprise Big Data Lake that integrates SCM, Marketing, Sales, Customer, and other key data from across the organization and analyzes it to understand overall performance and improve the business. It involves extracting data from various source systems and storing it in a single data lake after applying the required business transformations. The role requires expertise in Hadoop tools and technologies, including Sqoop, Pig, Hive, HBase, NiFi, Kafka, and the AWS cloud, along with the ability to research other suitable tools. Responsibilities include:
- Gathering functional requirements from the business and converting them into technical designs.
- Developing data pipelines using Hadoop tools such as Sqoop, Pig, and Oozie to ingest T-Mobile customer data into HDFS for analysis.
- Developing complex Hive queries to identify data patterns and improve business reporting.
- Exporting data from HDFS to Teradata/Oracle databases, and vice versa, using Sqoop.
- Configuring the Hive metastore with MySQL, which stores the metadata for Hive tables.
- Automating workflows using shell scripts.
- Working with the major Hadoop distributions (HDP/CDH) and numerous open-source projects.
- Processing data from local files, HDFS, and RDBMS sources by creating RDDs and optimizing them for performance.
- Optimizing RDDs using Spark's cache and persist functions.
- Applying thorough knowledge of Spark SQL and HiveContext.
- Processing data using Spark and the Scala programming language.
- Building code with tools such as Jenkins and Git and moving it to the production environment.
- Designing the ETL process and creating high-level design documents covering logical data flows, the source data extraction process, database staging and extract creation, source archival, job scheduling, and error handling.
- Participating in development, testing, migration, and L3 support.
- Analyzing data with Hive queries and Pig scripts to study customer behavior.
- Preparing the data refresh strategy and capacity planning documents required for project development and support.
- Prototyping applications that use modern Big Data tools.
- Participating in daily sprint meetings and working closely with analysts, designers, and other project teams.
- Implementing unit test cases for various modules and coordinating with the QA team on production deployments/releases.
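To illustrate the HDFS-to-Oracle export duty above, here is a minimal sketch of the kind of Sqoop export command involved. The hostname, credentials, table name, and HDFS path are hypothetical placeholders; a real run requires a Hadoop cluster with Sqoop installed, so this script only assembles and prints the command as a dry run.

```shell
# Assemble a sample Sqoop export command (dry run only).
# All connection details below are hypothetical placeholders.
SQOOP_CMD="sqoop export \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user \
  --table CUSTOMER_SUMMARY \
  --export-dir /data/lake/customer_summary \
  --input-fields-terminated-by '\t' \
  --num-mappers 4"

# Print the command instead of executing it, since Sqoop
# and the target database are not available here.
echo "$SQOOP_CMD"
```

Running the reverse direction (database into HDFS) would use `sqoop import` with a `--target-dir` argument instead of `--export-dir`.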
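The shell-based workflow automation and error handling mentioned above can be sketched as a small driver script. The step names and commands are hypothetical stand-ins (a real pipeline would invoke Sqoop, Pig, or Hive); the point is the pattern of running steps in order and aborting with a logged failure on the first error.

```shell
# Hedged sketch of a shell workflow driver. 'true' is a placeholder
# for real commands such as 'sqoop import ...' or 'pig -f job.pig'.
set -e

run_step() {
  local name="$1"; shift
  echo "starting: $name"
  "$@" || { echo "FAILED: $name" >&2; exit 1; }
  echo "finished: $name"
}

run_step "ingest"    true   # placeholder for a Sqoop import
run_step "transform" true   # placeholder for a Pig script
run_step "load"      true   # placeholder for a Hive load
```

In practice such a script would be triggered by a scheduler (the posting mentions Oozie), with the per-step logging feeding the error-handling design described in the ETL documents.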
Experience : 3+ years.
Qualification : Bachelor’s degree in Computer Applications, Computer Science, or Electronics and Communication Engineering.
Skills : Hadoop/Big Data, NiFi, Kafka, Sqoop, Pig, Hive, Oracle, MySQL.
Location : Doral, Florida. * Relocation may be required.
Send Resume to : HR Dept., Iblesoft Inc., 7801 NW 37th Street, Suite LP-104, Doral, FL 33195.
All employees of Iblesoft, Inc. are automatically enrolled in the company’s employee referral program. A referral fee of $1,000 will be paid if a referred candidate is hired by the company.
No. of positions : Multiple
Job Role : Hadoop Developer