Hello,
Hope you're doing well!
This is Amit Kumar from Damco Group.
Please share your resume with me at amitc1@damcogroup.com.
A visa copy is needed for submission.
Position: ETL Architect
Location: Atlanta, GA
Duration: Long Term
Job Description:
"Data/ETL Architect" [Analytics]
SUMMARY: The Data/ETL Architect will lead the definition & implementation of the data warehouse
solution to support Ventiv's Analytics Platform across Claims, Risk and Safety.
The Data Architect owns data architecture design and definition, maintains the data models, and provides
oversight and checkpoints to enforce architecture standards and solution quality. The Data Architect is
responsible for collaborating with Ventiv's internal business and technology partners and other stakeholders
to ensure the correct tools, technology, and architecture are in place to deliver timely and accurate information
to support all Analytics needs. The DA reviews all designs to ensure that project teams are meeting expectations
for quality and conformity.
DUTIES AND RESPONSIBILITIES:
• Coordinate and lead meetings to create & review the design of all data models, data integration and
analytics delivery. This will include high-level design of the overall solution and detailed design of
components as needed (Database, ETL, Data models, Semantic layer, etc.).
• Responsible for delivering a highly scalable, high-performing, robust data architecture to support high
volumes of structured & unstructured data sets.
• Own the end-to-end data flow & ETL architecture; design robust, fault-tolerant ETL processes with
auditing and monitoring capabilities to meet business needs (batch, real-time, incremental, etc.) as
needed.
• Provide leadership and guidance to project teams, developers and other architects
• Provide guidance and leadership on specific delivery methodologies for projects, such as technical
requirements, design patterns, code reviews and testing procedures
• Set strategy and oversee design for significant data modeling work, such as Logical Models, Physical
Models, DDL/schema generation, Conformed Dimensions, Enterprise Hierarchy, etc.
• Lead efforts to define/refine execution standards for all Data/Analytics layers (ETL, data modeling,
Cognos, Data science delivery, Platform etc.).
• Deliver descriptive & diagnostic analytics as well as other advanced capabilities such as predictive
and prescriptive analytics, data mining & discovery, etc.
• Regularly interact with leadership on project work status, priority setting and resource allocations.
• Research new tools and/or new architecture and review with project teams as applicable
• Work with support team to define methods for implementing solutions for performance
measurement and monitoring
• Assist infrastructure leads as needed with background and information on all technologies in use for
projects such as new version upgrades, migration of hardware, production issues, etc.
• Provide leadership and guidance on setting up environments used by the project team.
• Participate in business & functional requirements gathering sessions / workshops.
o Translate business requirements into specifications for reporting, data, and models, and
communicate effectively with technical and business resources.
o Perform data analysis, profiling & validation activities as needed.
o Translating technical and architectural complexities so that the stakeholders can easily
understand issues and challenges.
o Map client custom data conversions, validate data, and resolve issues with the client
(as needed).
o Manage defect resolutions.
• Provide expert advice and assistance in the identification and diagnosis of problems during
implementation.
• Ensure the solution meets the specified requirements.
• Create and maintain documentation such as standards, models, data dictionary, etc.
• Active participant in SDLC process (Agile & waterfall methodologies)
• Other duties as assigned
EDUCATION
• Bachelor of Arts, Bachelor of Science, or similar.
Skills / Experience Required
• 10+ years as a Data/ETL architect with large scale data architecture, design & modeling with data
warehousing & big data implementations.
• 3+ years of experience delivering highly scalable, high performing, robust data architecture to
support high volumes of structured & unstructured data-sets.
• 2+ years in the insurance, safety, or healthcare industry.
• Expert in various data modeling techniques – Star Schema, ODS, 3NF, Data Vault, etc.
• 5+ years in implementing Analytics projects [descriptive, diagnostic, predictive, prescriptive analytics]
• Must have experience with both waterfall & agile methodologies.
• Ability to handle multiple concurrent priorities and work effectively under pressure in a fast-paced
environment to meet deadlines & deliver a quality product.
• Strong collaboration & teamwork skills. Ability to interact at all levels with all stakeholders, including
business & technology. Must have good conflict management skills.
• Specific software skills
o Erwin, ERStudio, TOAD, or similar data modeling & data management tools.
o Relational DBMS such as PostgreSQL, Oracle (preferred), etc.
o Experience with leading ETL tools, such as Talend (preferred) or Informatica.
o Experience with leading OLAP/ROLAP tools (Business Objects, Microsoft, MicroStrategy, or
Cognos – preferred).
o Experience with data visualization tools such as Tableau.
o Experience with predictive & statistical modeling tools such as SAS, SPSS, R, etc.
o Experience with other web-based technologies such as CSS, HTML, XML, JavaScript, and SOA
Web Services.
o JIRA.
• Non-Technical skills
o Demonstrated experience working on strategy projects in the Analytics, BI, Data Warehouse,
and/or Data Management space.
o Strong experience in both Agile and Waterfall methodologies.
o Must have strong analytical, diagnostic, and creative problem-solving skills.
o Must have strong process skills and the ability to identify inefficiencies.
o Must be capable of working well under pressure, independently or on a team.
o Must be a self-starter with good organizational and analytical skills and a solid ability to work
in a team-oriented environment.
o Excellent communication and documentation skills.
PREFERRED QUALIFICATIONS
• BS, MS Degree in Computer Science, Electrical Engineering, Telecommunication, related engineering
field or equivalent experience.
• 2+ years' experience on Hadoop/Big Data with at least one fully executed project using Hadoop
technologies (Cloudera, Hortonworks, BigInsights, etc.).
• Prior experience working in technology companies delivering analytics products is a huge plus.
• Prior experience working as a development DBA leading performance tuning efforts.
• SOA Web Services experience.
• Experience with Columnar databases.
• Knowledge of big data tools such as Spark, Pig, Hive, HBase, MapReduce, etc.
• Experience with cloud technologies such as AWS, EC2, etc.
Regards,
Amit Kumar Chandel || Technical Recruiter || Damco Solutions Inc.| T: +1 609-236-3030/+1 609-246-5602
2 Research Way, Princeton NJ 08540 | E: | W: www.damcogroup.com
Ensuring Success, Always
Please consider the environment before printing this e-mail