Cognizant (NASDAQ-100: CTSH) is one of the world's leading professional services companies, transforming clients' business, operating and technology models for the digital era. Our unique industry-based, consultative approach helps clients envision, build and run more innovative and efficient businesses. Headquartered in the U.S., Cognizant is ranked 205 on the Fortune 500 and is consistently listed among the most admired companies in the world. Learn how Cognizant helps clients lead with digital at www.cognizant.com or follow us @Cognizant.

Cognizant is hiring for
Hadoop/Spark/Kafka, MongoDB Developer, Cassandra Developer, Hadoop Admin, Informatica MDM Developer, Stibo Developer, Informatica PIM Developer, SAP MDG Developer, Oracle CDH/PDH Developer, IBM MDM, Qlik Sense Developer, QlikView Developer, Azure Developer, AWS Developer (Redshift)
with 3 - 10 years of experience

Walk-in Date: 8th June 2019

Walk-in for Bangalore

Cognizant Technology Solutions
65/2, Baghmane Tech Park, 2nd Floor, C.V. Raman Nagar, Bangalore – 560093

Walk-in for Pune

Cognizant Technology Solutions,
4th Floor, ICC Tech Tower, SB Road, Pune. Landmark: Above Pantaloons.

Walk-in for Chennai

Cognizant Technology Solutions,
3rd Floor, Left Wing, 5/639, Old Mahabalipuram Road, Varalakshmi Tech Park, Kandanchavadi, Chennai – 600096. Landmark: Near Apollo Hospital.

Walk-in for Hyderabad

Cognizant Technology Solutions,
Building No. 20, 8th Floor, Raheja Mindspace, Vittal Rao Nagar, Hi-Tech City, Madhapur – 500039.

Position | Job Summary | Location | Experience | SPOC Person
Hadoop/Spark/Kafka
  • Basic understanding of Hadoop/Big data concepts is a must.
  • Candidate should have strong experience in Kafka and Spark, with at least one programming language (Scala or Java).
  • Should have at least 1 year of experience working on the Hadoop ecosystem.
  • Should have exposure to some of the tools in the Hadoop ecosystem (Flume, Sqoop, etc.).
  • Should be able to discuss some of the best practices or performance tips applied in past experience.
  • Experience in Cloudera distribution will be an added advantage.
  • Should have good written and spoken communication skills. Expected to be self-managed and able to provide effective status updates on individual work.
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Dhinesh
MongoDB Developer
  • Working experience in MongoDB development is a must.
  • In-depth knowledge of modeling/architectural patterns and potential limitations within MongoDB.
  • Development experience building a model/framework to consume data from and push data into MongoDB (Aggregation and CRUD operations).
  • Specialized technical knowledge in any of the MongoDB APIs – Java/C#/Python/NodeJS.
  • Good troubleshooting skills.
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Dhinesh
Cassandra Developer
  • A Scala Engineer building APIs with libraries like Play, Finagle, etc. Experience with these specific libraries is not a hard requirement; an experienced Scala developer can learn on the job.
  • A Big Data Engineer working with the Hortonworks Data Platform Stack with Ambari, Spark & Hadoop, but also Scala. The candidate must also know HDF (Apache NiFi).
  • A Fast Data Engineer with experience in Apache Flink, Kafka, Cassandra and Scala. An experienced Scala developer can build Apache Flink/Kafka expertise on the job.
  • Knowledge of and experience working with data security solutions like Ranger. The candidate must be well versed in customizing Ranger.
  • SCRUM Agile development experience
  • Experience with CI tools such as Jenkins, GitLab, etc.
  • Experience with sbt and ScalaTest
  • Experience in Python and REST API would be an added advantage
  • Good working knowledge of Linux
  • Experience with and an understanding of how Docker works is a prerequisite
  • A hands-on DevOps engineer, willing to learn on the go and displaying a sense of ownership and commitment
  • Good social and communication skills
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Dhinesh
Hadoop Admin
  • Working with business and application development teams to provide effective technical support
  • Assist / Support technical team members for automation, development, and security
  • Identifying the best solutions and conducting proofs of concept leveraging Big Data & Advanced Analytics to meet functional and technical requirements
  • Interfacing with other groups such as security, network, compliance, storage, etc.
  • Transforming data from RDBMS to HDFS using available methodologies
  • Administration and testing of DR, replication & high availability solutions
  • Implementing Kerberos, Knox, Ranger, and other security enhancements
  • Managing and reviewing Hadoop log files for audit & retention
  • Arranging and managing maintenance windows to minimize impact of outages to end users
  • Moving services (redistribution) from one node to another to facilitate securing the cluster and ensuring high availability
  • Assist in reviewing and updating all configuration & documentation of Hadoop clusters as part of continuous improvement processes
  • Mentoring & supporting team members in managing Hadoop clusters
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Dhinesh
Informatica MDM Developer
  • Requirement gathering, Design, Data Modeling.
  • Integration with Source systems and Downstream applications.
  • MDM development in the Hub console. Knowledge of complex cleanse functions, Security Access Manager, match/merge rule setup, and trust enablement.
  • IDD development, generating CO CS, integrating with AVOS workflows, Java user exits.
  • Knowledge on Entity 360.
  • Installation experience, applying patches/EBFs.
  • Knowledge in Power Center
Location: Bangalore/Hyderabad/Pune/Chennai/Kolkata | Experience: 3 - 10 Yrs | SPOC: Keerthiga
Stibo Developer
  • 4 to 6 years of overall experience
  • 2+ years of technical hands on experience in Stibo STEP
  • Strong knowledge of current STEP Trailblazer releases such as 8.x and 9.x
  • Experience interacting with customers and guiding them in building STEP solutions.
  • Experience in designing STEP HLD and LLD documents
  • Hands on experience in implementing MDM/PIM requirements in Stibo STEP
  • Good communications and articulation skills.
Location: Bangalore/Hyderabad/Pune/Chennai/Kolkata | Experience: 3 - 10 Yrs | SPOC: Keerthiga
Informatica PIM Developer
  • Technically strong Informatica PIM developer to lead the offshore team. The candidate should be capable of managing expectations of client and Cognizant stakeholders.
  • The candidate is expected to have strong written and verbal communication skills.
Location: Bangalore/Hyderabad/Pune/Chennai/Kolkata | Experience: 3 - 10 Yrs | SPOC: Keerthiga
SAP MDG Developer
  • Should have worked on implementation projects involving the SAP MDG solution for any of the following: Customer, Vendor, Material, Financial masters, etc.
  • Strong knowledge of ABAP Object Oriented Programming, Master Data Governance (MDG), ABAP Floor Plan Manager (FPM), ABAP Web Dynpro, ABAP Workflow, Standard Data Model Enhancement, Custom Data Model Creation.
  • Good knowledge of MDG UI configuration, FPM UI enhancement, context-based adaptation, customizing and configuration, along with knowledge of Web Dynpro and Floor Plan Manager
  • Should have experience in process modelling configuration (Business Activity, Change Request type, Workflow, Rule-Based Workflow, BRF+)
  • Ability to work on migration of existing master data into the MDG hub, with validations and derivations using BAdIs and BRF+. Should have worked on the data replication framework (DRF), data import using DIF, SOA services, ALE configuration, and key & value mapping. Knowledge of integrating MDG with SAP Information Steward and SAP Data Services for data remediation and data validation respectively.
  • Knowledge of S/4HANA migration projects.
  • Experience in supporting UAT phase and go-live period of MDG Implementation project
Location: Bangalore/Hyderabad/Pune/Chennai/Kolkata | Experience: 3 - 10 Yrs | SPOC: Keerthiga
Oracle CDH/PDH Developer
  • IT experience, software development lifecycle, Enterprise RDBMS, Oracle MDM Concepts.
  • SQL/PLSQL, Java(J2EE), ADF and OAF programming experience is a must.
  • Experience in setting up a new ADF environment in an EBS stack.
  • Experience in developing and maintaining business applications.
  • Working with JDBC data sources.
  • Background in MDM concepts and Oracle PDH
  • Design and configure data-driven workflows, and develop supporting business rules.
  • Experience in Oracle R12 or later
  • Prepare documentation for code builds (MD70)
  • Proficient in thinking critically and challenging norms
  • Experience with WebLogic server deployments and troubleshooting
  • Strong verbal and written communications skills (English)
Location: Bangalore/Hyderabad/Pune/Chennai/Kolkata | Experience: 3 - 10 Yrs | SPOC: Keerthiga
IBM MDM
  • Drive design discussion on new functionalities like auto-merge/collapse.
  • Actively interact with business users, clients and different vendor partners on L2/L3 INCs/SRs.
  • Working with Application Integration team to onboard new Consumers of MDM services
  • Performance Tuning of bottlenecks in any transaction
  • Preparing data report to monitor efficiency of suspect identification rules.
  • Prepare and present weekly status reports, Monthly Metrics reports to client
  • Prepare release plans for quarterly/monthly releases and co-ordinate with Release management team
  • Prepare game plan for infra outages impacting prod/non-prod MDM instances
  • Analyze and troubleshoot incidents created on the applications
  • Strong technical analytical and debugging skills
  • Strong ITIL knowledge.
Location: Bangalore/Hyderabad/Pune/Chennai/Kolkata | Experience: 3 - 10 Yrs | SPOC: Keerthiga
Qlik Sense Developer
  • Implement various data modeling, visualization and reporting techniques
  • Creating QVD files and transformations
  • Create different types of charts for QlikView dashboard analysis
  • Implementing set analysis in QlikView
  • Provide input on proposing, evaluating and selecting appropriate design alternatives that meet client requirements and are consistent with the client's current standards and processes
  • Create and maintain technical design documentation
  • Extracting, transforming and loading data from multiple sources into QlikView applications
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Karthika
QlikView Developer
  • Very good experience in QlikView scripting, data modeling and visualization
  • Good experience with databases such as SQL Server
  • Should have completed at least two end-to-end projects
  • Able to work as an individual contributor
  • At least 3-5 years of experience in QlikView
  • Good knowledge of QlikView Server and Publisher
  • Proficient in developing QlikView dashboards
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Karthika
Azure Developer
  • Prior Azure development experience in at least three of the following Azure services:
  • Azure Data Factory
  • Azure Data Lake Store
  • Azure Data Lake Analytics/USQL
  • Azure Data Warehouse / SQLDB
  • HDInsight
  • Azure Event Hub
  • Stream Analytics
  • Azure Analysis services
  • Hands-on programming/scripting skills preferred (any of .NET/C#, Python, JavaScript, R)
  • Experience in ETL development & deployment using SQL Server, SSIS is a plus
  • Profiles with rich customer facing experience will have higher preference
  • Should be able to work independently, as part of a distributed team across different geographies
  • Should have good communication skills to work with team members/client teams in distant locations.
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Jeeva
AWS Developer (Redshift)
  • AWS Technical Essentials knowledge mandatory
  • AWS RDS, Aurora knowledge essential
  • Python scripting skill nice to have
  • SQL hands on expertise mandatory
  • AWS certification preferred
  • Working knowledge of the Redshift database and related monitoring
  • Conceptual knowledge of EC2, VPC and S3
Location: Bangalore/Hyderabad/Pune/Chennai | Experience: 3 - 10 Yrs | SPOC: Jeeva