Hadoop Developer
Computer Power Group Inc
Job ID VAC 160
Job Type Contract Basis
Industry/Sector IT Software, Software Services
Functional Area IT
Job Role Sr. Developer
Job Location Foster City, CA
Experience 10 - 15 Years
No.of Vacancies 1

Application will redirect you to company website
Job Description


We have two openings: a Hadoop Developer and a Hadoop Solutions Architect. Both are onsite positions (no telecommuting).

For the Hadoop Developer, the max rate C2C is $90.

For the Hadoop Lead (Solutions Architect), the max rate C2C is $100.

These are very high-profile positions within VISA; the project is at the ground floor and deals with security analytics.
Note that if the first-round interview goes well, the hiring manager will want to meet the consultant in person for a second-round (face-to-face) interview, so please keep that in mind. If the consultant is offered and accepts the position, our company will reimburse 50% of the roundtrip airline ticket for that second-round interview. Look forward to working with you on this req :)

Must have skills:
-Experience programming with Hadoop (Developer candidate)
-10 to 15 years of experience working with data warehouses (for the Hadoop Solutions Architect role)
-7 to 10 years of experience working with data warehouses (for the Hadoop Developer role)
-Experience programming against large databases
-Good communication skills
Nice to have experience:
-SAS or R
-EMC Greenplum
-Cloudera platform
Roles and responsibilities:
-Development work on a security analytics project
Preference on industry background or previous companies worked for:
-Financial services, banking or card processing would be ideal but not required
Interview timeline and process:
-Phone interview, possibly followed by an in-person interview



This description covers the Hadoop Solutions Architect (detailed below), but there is also a Hadoop Developer position available.

Primary responsibilities:

1. Work with business users, the technology office, and information services to ensure technology is implemented in such a way as to meet and exceed the expectations of a sophisticated and demanding business data-mining community.
2. Evolve the Data Platform architecture and its data, compute, and access paradigms to meet and exceed business user expectations.
3. Support and advise business users on use of the Data Platform to solve business/analytical problems.
4. Apply a solid understanding of all phases of development using multiple methodologies, e.g. Waterfall and Agile.
5. Manage technical staff as required.
6. Conduct code/architecture reviews as required.
7. Define and support the process for access to the Data Platform, balancing the needs of various business groups with security requirements and Platform capacity.
8. Work with Architecture and Development teams to understand usage patterns and workload requirements of new projects in order to ensure the Data Platform can meet demand.
9. Maintain all system configuration documentation by collecting, storing, and updating the documentation.
10. Develop administrator processes, document them, and then train other client users, as warranted, to take over the processes.



A minimum of 12 years of technical experience with a focus on open source and large data implementations in the petabyte range. Business acumen and technical expertise are both required for the role. Experience developing solutions for banking, fraud, risk, or marketing groups desired.
Proven problem-solving skills and an ability to respond resourcefully to new demands, priorities, and challenges.
Demonstrated ability to deliver with high quality and attention to detail.
Excellent verbal, written, and listening communication skills.
Demonstrated ability to influence and develop relationships; collaboration skills.
Proven ability to manage teams.
Bachelor's degree in a technical field such as computer science, computer engineering, or a related field required. MBA or other related advanced degree preferred.

1. Proven Linux experience, including:
Basic administration
Files and permissions
Directory navigation
Job scheduling
Shell scripts
2. Hadoop Distributed File System (HDFS) experience, including:
Use and setup of blocks, NameNodes, DataNodes
File system interfaces, parallel copies, cluster balancing, and archiving
Scaling out, including data flow, combiner functions, running distributed jobs
Hadoop Streaming with Python
Hadoop Pipes
Sorting, joins, and side data distribution
3. Experience with Hive, including:
Familiarity with Hive Query Language (HiveQL)
Plug-ins, interfaces, user-defined functions, SerDes
Basic operators and functions, the web interface, and the Hive client
4. General knowledge of relational databases and BI tools
e.g. MicroStrategy, Tableau, Greenplum, DB2, Oracle
5. Familiarity with analytic environments
e.g. SAS, SPSS, R
6. Working knowledge of infrastructure required for Hadoop/analytics environments
Capacity management and forecasting
7. Desired but not required experience
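For candidates unsure what the "Hadoop Streaming with Python" item above implies in practice, here is a minimal illustrative sketch (not part of the posting's requirements): a word-count mapper and reducer written as plain Python functions, plus a tiny local driver that stands in for Hadoop's shuffle/sort phase. All names here (mapper, reducer, run_local) are invented for this example; in a real streaming job the mapper and reducer would be separate scripts passed to hadoop-streaming.jar.

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    """Emit (word, 1) pairs for each word on one input line."""
    for word in line.strip().split():
        yield word.lower(), 1

def reducer(word, counts):
    """Sum the counts for a single key (all pairs share the word)."""
    return word, sum(counts)

def run_local(lines):
    """Simulate map -> shuffle/sort -> reduce on in-memory input."""
    pairs = [kv for line in lines for kv in mapper(line)]
    pairs.sort(key=itemgetter(0))  # Hadoop sorts by key before reducing
    return dict(
        reducer(key, (count for _, count in group))
        for key, group in groupby(pairs, key=itemgetter(0))
    )

if __name__ == "__main__":
    print(run_local(["Hadoop streaming with Python", "python and hadoop"]))
```

The same two functions, written to read stdin and write tab-separated key/value lines, are what a streaming job would ship to the cluster.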


Nagaraju Akkiraju (Naga)
Reach me: 262-432-2543
Manager - Head Staffing
Computer Power Group Inc
