Are you a seasoned and innovative analytic thinker with extensive functional and technical design and development experience with end to end Data & Analytics towards enabling Data Transformation?
Have you worked as a part of fast-paced product teams using Agile/Scrum methodologies and DevOps practices to build customer facing product features?
If so, the PVH Enterprise Data & Analytics group is looking for you.
PVH is one of the world's largest global apparel companies, with a history going back over 135 years of growing brands and businesses with rich American heritages. We own and market iconic brands like CALVIN KLEIN and Tommy Hilfiger, and also market a variety of goods under our own and licensed brands. We are over 35,000 associates operating in over 40 countries.
PVH has embarked upon a data transformation journey, building a world-class enterprise data & analytics capability to manage all types of data for consumption across PVH, ultimately unlocking business value through hindsight, insight & foresight.
In your new role within the ED&A (Enterprise Data & Analytics) team, you will have the opportunity to continue the development of an industry-leading analytics platform, with opportunities to apply reporting, analytics, data science and machine learning toward end-to-end analytic capabilities across the PVH enterprise.
The Data Engineer - Enterprise Analytics will be responsible for developing pipelines that rapidly ingest data and make it available to the enterprise for Reporting, Dashboard, Machine Learning & AI capabilities across all product domains (Commercial, Consumer & Supply). You will collaborate with experts in the data & analytics space to define, influence & develop innovative data product solutions. The team leverages best-in-class cloud-based data lake & analytics tools, and you will have the opportunity to hone your experience with them and help evolve our application of them.
The candidate will play a critical role in the development of the Enterprise Data Platform (EDP), with a view to providing a full suite of business capabilities ranging from basic reporting and self-serve to executive dashboards, advanced analytics and data science. The candidate will work closely with the team's Data Engineers, Architects and Developers to contribute recommendations for tool choices and provide expertise on best practices for data management, scalable solution design and architecture. The role is hands-on and comes with the opportunity to build a world-class analytics platform from scratch.
PRIMARY RESPONSIBILITIES/ACCOUNTABILITIES OF THE JOB:
- Build, test and deploy pipelines using Azure Data Factory, DataBricks, Delta Lake & PySpark
- Assemble a robust and repeatable framework that takes advantage of the Microsoft Azure platform tools, industry best practices, capabilities and functionalities (Azure: Data Factory, Data Lake, DevOps, DataBricks, Synapse Analytics & more)
- Provide recommendations for scalable & efficient design, process and methodology for data ingestion, processing and storage
- Deploy solutions that contribute to the assembly of scalable Machine Learning & AI capabilities as part of data products & related solutions across all analytic workstreams, and execute on a plan to achieve that vision
- Co-create solution frameworks with Data Scientists, Product Managers and data/analytics Engineers to understand data & analytic needs and drive iterative & innovative evolution of advanced analytic capabilities, taking into account Governance, Security, Permissions and Cost
QUALIFICATIONS & EXPERIENCE:
Experience:
- Build a strong and transparent working relationship with the analytic product and technology teams on communication, prioritization and development against commercial, consumer and supply product roadmaps
- Understanding business use cases and technical requirements, and converting them into a technical design that elegantly meets those requirements
- Writing and reviewing solution & capability documentation, explaining overall business solution architecture, framework, and high-level design for the developers
- Ensuring adherence to sound architecture principles, engineering guidelines and coding standards
- Mentoring team members in projects and helping them keep up with new technologies
- 6+ years' experience of increasing levels of responsibility in data & analytics solutions in the retail industry
- 3+ years of deploying ML & AI solutions, preferably in a retail setting, and driving process transformation to define data-driven solutions to critical business needs
- Experience in ML Ops, data lake, pipeline and data science environment, visualization tools, solution architecture and development using best practices and modern frameworks
- Excellent troubleshooting, analytical and debugging skills
- Experience in release management and Continuous Integration & Deployment methodologies
- Proficient in communicating and driving technology solutions that support business objectives.
- Ability to work effectively and manage partnerships with business and technical stakeholders at all levels of the organization.
- Experience with agile development methodology and collaboration tools such as Teams, JIRA, Confluence
B.S. degree in computer science, a similar degree, or equivalent work experience
Skills:
Required:
- Expertise working in a cloud environment (Azure preferred; AWS, Google) - Data Lake and Reporting/Visualization tools and their components to enable enterprise data solutions - Azure Data Factory, DataBricks, Delta Lake, PySpark
- Expertise in conceptualizing, defining & enabling end to end data capabilities and solutions in an enterprise data model
- Able to create high-quality design & solution documentation, including architecture and data flow diagrams.
- Comfortable presenting to senior leaders and communicating directly with the business
- Experience creating a solution strategy & roadmap for a data & analytics function
- Large-scale / distributed computing and analytics (Hadoop/Spark etc.)
- Knowledge of SQL
- Software development / engineering a plus
- Source control & CI/CD and collaboration tools (Git etc.)
- Scaled model testing, integration and deployment
- Microsoft tech stack, Azure preferred
- Agile ways of working (2-week sprints, backlog, retrospectives etc.)