This project will deliver a Maximo Business Intelligence solution for analyzing asset and maintenance program data, supporting both the calculation of financial and operational performance metrics and the deep analytics needed to assess asset maintenance program effectiveness.
The project will enable:
- Identification of the drivers of the cost of maintaining, repairing and replacing assets
- Development of dashboards and tools for decision making and analysis
- Establishment of rich data sets to feed asset planning and replacement models
- Development, publication and analysis of performance indicators
Our Client has identified the following business objectives that it expects the project to achieve in support of its Asset Management practice:
- Expand access: Provide a scalable Business Intelligence solution that enables staff to access, exploit and analyze detailed information about assets and asset maintenance activities
- Build data literacy: Develop a data dictionary, including functional and physical data definitions and derivation/origination, to support informed analysis and discovery
- Extend analytical boundaries: Enhance reporting capabilities by including data elements that are not addressed in the current stopgap environment
- Build common understanding: Foster standardization of reporting results across business groups through adoption of a common reporting source and method
- Provide business continuity: Deliver in the new solution the commonly used analytical and management reports that are currently produced via labor-intensive stopgap measures
- Bridge information silos: Identify potential data integration opportunities across sources and subject areas (e.g., Financial, GIS, SCADA, Construction)
Measurable Project Objectives and Success Criteria:
The project will achieve the following business objectives:
- Expose existing Maximo asset and work management data for analysis
- Decommission the current stopgap framework for analytical reporting, including custom Maximo reporting objects
- Identify, document, and potentially exploit data integration opportunities
- Develop a data dictionary for all provided content including functional and physical data definition and derivation/origination
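The data dictionary objective above calls for capturing both functional and physical definitions plus derivation/origination for each element. As a minimal sketch of what one such entry might look like, the structure below uses a hypothetical Maximo attribute (`WORKORDER.ACTLABCOST`) purely for illustration; the actual dictionary format and content would be defined during the project.

```python
from dataclasses import dataclass

# A minimal sketch of one data dictionary entry. The field names and the
# sample WORKORDER attribute below are hypothetical placeholders.
@dataclass
class DataDictionaryEntry:
    logical_name: str           # business-facing name of the element
    physical_name: str          # table.column in the source system
    data_type: str              # physical data type
    functional_definition: str  # what the element means to the business
    derivation: str             # how/where the value originates

entry = DataDictionaryEntry(
    logical_name="Work Order Actual Labor Cost",
    physical_name="WORKORDER.ACTLABCOST",
    data_type="DECIMAL(10,2)",
    functional_definition="Total labor cost actually charged to the work order.",
    derivation="Summed from labor transactions posted against the work order.",
)
```

A catalog of such entries, one per exposed column, would satisfy both the physical-definition and derivation/origination requirements in a single artifact.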
Role: Architecture, design, development, testing, deployment, and automation of end-to-end, source-to-target-to-user ETL and ELT data pipelines across complex on-premises and off-premises platforms.
- 15+ years in IT and at least 10 years of development expertise in creating data-centric solutions on Microsoft/Oracle/Azure/AWS stacks
- Understanding of traditional data warehouse architectures and how to migrate them to modern technologies
- In-depth understanding of data management, security, networking standards, hosted and managed services.
- Strong experience in building the infrastructure required for optimal extraction, loading and transformation of structured and unstructured data sets
- Experience in defining Data standards, Data governance and lineage, and Data migration between database technologies
- Strong knowledge of machine learning, including pattern recognition, clustering, text mining, etc.
- Experience with, and keen interest in, exploring the latest technologies and programming languages
- Experience with big data tools: Hadoop, MapReduce, HBase, Oozie, Flume, MongoDB, Cassandra, and Pig
- Experience and thorough, full-stack knowledge of cost-optimized cloud deployments spanning compute, network and storage.
- Knowledge of and experience with cloud data solution offerings (for example, on Azure: Azure Data Lake, Data Factory, Data Management Gateway, Azure Storage options, DocumentDB, Data Lake Analytics, Stream Analytics, Event Hubs, Azure SQL, etc.)
- Experience with message queuing, stream processing, and highly scalable ‘big data’ data stores
- Experience with NoSQL databases, such as Cassandra, MongoDB, and CosmosDB
- Experience with DevOps tools: ADO, Git, Jenkins, Docker, etc.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Solid understanding of data warehouse principles and multi-dimensional data modeling concepts, source to target mapping and data integration architecture
- Strong experience with analysis and visualization tools, including development, setup, and configuration of R, Power BI, Tableau, etc.
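The role above centers on building ETL/ELT pipelines that apply business rules between source and target. As a minimal, language-agnostic sketch of that extract-transform-load shape, the example below runs over in-memory rows; the source structure, column names, and the cost-per-hour rule are hypothetical stand-ins, not the project's actual transformation logic.

```python
# Minimal extract-transform-load sketch over in-memory rows. The source
# record layout and the cost-per-hour business rule are illustrative only.
def extract(source_rows):
    # Extract: pull raw records from the source system
    return list(source_rows)

def transform(rows):
    # Transform: apply a simple business rule (labor cost per labor hour)
    out = []
    for r in rows:
        hours = r["labor_hours"]
        out.append({
            "asset_id": r["asset_id"],
            "cost_per_hour": r["labor_cost"] / hours if hours else None,
        })
    return out

def load(rows, target):
    # Load: append conformed rows to the target store
    target.extend(rows)
    return len(rows)

source = [{"asset_id": "A-100", "labor_cost": 500.0, "labor_hours": 10.0}]
warehouse = []
load(transform(extract(source)), warehouse)
```

In an ELT variant the same transformation rule would instead run inside the target platform (e.g., as SQL) after a raw load; the staging-then-transform ordering is the only structural difference.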
- Meet with business analysts and subject matter experts to understand and document business requirements
- Translate customer requirements into unambiguous, scalable, robust and flexible technical solutions for implementation
- Create and maintain architecture diagrams, data models, mapping documents, business rules, data flow diagrams and other design related artifacts
- Assist the data warehouse team in designing efficient processes to load and manage data, including assessing data quality in the source systems and implementing appropriate business rules, data mappings, and transformation rules
- Actively participate in code reviews, unit testing, and system integration testing, and remediate solution defects
- Analyze and troubleshoot production issues quickly to ensure system uptime meets service level agreements
- Clearly and concisely communicate status of all assigned tasks to the project team, stakeholders and management
- Demonstrated ability to meet tight deadlines, follow development standards and effectively raise critical issues with the team
- Must be a self-starter who can work independently yet be a strong team player with excellent attention to detail and customer service skills
- Ability to prioritize and complete technical tasks with minimal instruction and supervision
- Establish strong working relationships with coworkers, customers, project team, vendors and across all levels of the organization
- Strong oral and written communication skills to convey technical details to non-technical staff and customers
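Among the design artifacts listed above are mapping documents and source-to-target mappings. One common way to keep such a mapping maintainable is to express it as data and apply it mechanically; the sketch below illustrates that idea with hypothetical Maximo-style column names and trivial transformation rules, not the project's real mapping.

```python
# A source-to-target mapping expressed as data and applied to one record.
# Column names and transformation rules are hypothetical illustrations.
MAPPING = [
    # (source_column, target_column, transformation_rule)
    ("WONUM",  "work_order_id", str.strip),   # trim whitespace
    ("STATUS", "status_code",   str.upper),   # normalize casing
]

def apply_mapping(source_record, mapping):
    # Build the target record by applying each rule to its source column
    target = {}
    for src, tgt, rule in mapping:
        target[tgt] = rule(source_record[src])
    return target

row = apply_mapping({"WONUM": " 1001 ", "STATUS": "comp"}, MAPPING)
```

Because the mapping lives in one table-like structure, the same artifact can serve as both the design-time mapping document and the runtime configuration, which helps keep documentation and implementation from drifting apart.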
Reference Number: 5231