This project will deliver a Maximo Business Intelligence solution for analyzing asset and maintenance program data, supporting both the calculation of financial and operational performance metrics and deeper analytics that assess asset maintenance program effectiveness.
The project will enable:
- Identification of the drivers of the cost of maintaining, repairing and replacing assets
- Development of dashboards and tools for decision making and analysis
- Establishment of rich data sets to feed asset planning and replacement models
- Development, publication and analysis of performance indicators
Our Client has identified the following business objectives that it expects the project to achieve in support of their Asset Management practice:
- Expand access: Provide a scalable Business Intelligence solution that enables staff to access, exploit and analyze detailed information about assets and asset maintenance activities.
- Build data literacy: Develop a data dictionary including functional and physical data definition and derivation/origination to support informed analysis and discovery.
- Extend analytical boundaries: Enhance reporting capabilities by including data elements that are not addressed in the current stopgap environment.
- Build common understanding: Foster standardization of reporting results across business groups through adoption of a common reporting source and method.
- Provide business continuity: Deliver, through the new solution, the commonly used analytical and management reports that are currently produced via labor-intensive stopgap measures.
- Bridge information silos: Identify potential data integration opportunities across sources and subject areas (e.g., Financial, GIS, SCADA, Construction).
Measurable Project Objectives and Success Criteria:
The project will achieve the following measurable objectives:
- Expose existing Maximo asset and work management data for analysis
- Decommission the current stopgap framework for analytical reporting including custom Maximo reporting objects
- Identify, document, and potentially exploit data integration opportunities
- Develop a data dictionary for all provided content including functional and physical data definition and derivation/origination
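The data dictionary objective above calls for functional and physical definitions plus derivation/origination for every provided element. As a minimal sketch of what one such entry might look like (the element name, column details, and derivation text are hypothetical, not taken from the client's actual Maximo schema):

```python
# Hypothetical data dictionary entry capturing functional and physical
# definitions plus derivation/origination, as the objective describes.
DATA_DICTIONARY = {
    "ASSET.TOTALCOST": {
        "functional_definition": "Cumulative cost of maintaining the asset.",
        "physical_definition": "DECIMAL(15,2), column TOTALCOST on table ASSET",
        "derivation": "Sum of actual labor, material, service, and tool "
                      "costs from closed work orders against the asset.",
        "origination": "Maximo work order actuals",
    },
}

def describe(element: str) -> str:
    """Return a one-line summary for a dictionary element, or flag a gap."""
    entry = DATA_DICTIONARY.get(element)
    if entry is None:
        return f"{element}: NOT DOCUMENTED"
    return (f"{element}: {entry['functional_definition']} "
            f"({entry['physical_definition']})")

print(describe("ASSET.TOTALCOST"))
print(describe("ASSET.STATUS"))  # flags an undocumented element
```

In practice the dictionary would be maintained as governed metadata rather than code; the sketch only illustrates the fields each entry is expected to carry.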
Role: Design, development, testing, deployment, and automation of end-to-end (source-to-target-to-user) ETL and ELT data pipelines across complex on-premises and off-premises platforms.
- Working knowledge of data migration/integration with off-premises/cloud services (e.g., Azure, AWS).
- Experience building, administering and managing scalable analytical platforms containing both structured and unstructured data.
- Experience with full-stack DevOps engineering. Knowledge of compute, network, storage, and cost-optimized implementations in addition to application development.
- Familiar with ETL/ELT best practices
- Explore new tools and methods to optimize existing ways of acquiring data
- Knowledge of the big data ecosystem, including tools such as Hadoop, MapReduce, HBase, Oozie, Flume, MongoDB, Cassandra, and Pig
- Knowledge of machine learning, including pattern recognition, clustering, text mining, etc.
- Experience with NoSQL databases, such as Cassandra, MongoDB, and Cosmos DB.
- Experience working with DevOps tools: ADO, Git, Jenkins, Docker, etc.
- Experience building the infrastructure required for optimal ETL/ELT process for large data sets in a variety of formats (structured and unstructured)
- Knowledge of version control and change/release management processes. Experience with source control tools such as Team Foundation Server (TFS), Visual Studio Team Services (VSTS), or Git (preferred).
- Solid understanding of data warehouse principles and multi-dimensional data modeling concepts, source-to-target mapping, and data integration architecture. Foundational knowledge of traditional end-to-end ETL/OLAP solutions, preferably (but not necessarily) on the Microsoft SSIS/SSAS stack.
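The requirements above center on source-to-target mapping and ETL/ELT pipelines with embedded data-quality rules. A minimal sketch of that pattern in Python (the column names, mapping, and quality rule are hypothetical, chosen only to illustrate the shape of such a step, not drawn from any specific Maximo implementation):

```python
from datetime import date

# Hypothetical source-to-target column mapping, as a mapping document
# would specify it: Maximo-style source names -> warehouse target names.
MAPPING = {
    "wonum": "work_order_id",
    "actfinish": "finish_date",
    "actlabcost": "labor_cost",
}

def transform(rows):
    """Rename columns per MAPPING and drop rows failing a quality rule."""
    out = []
    for row in rows:
        mapped = {MAPPING[k]: v for k, v in row.items() if k in MAPPING}
        # Illustrative data-quality rule: labor cost must be present
        # and non-negative; failing rows are rejected before load.
        if mapped.get("labor_cost") is None or mapped["labor_cost"] < 0:
            continue
        out.append(mapped)
    return out

source_rows = [
    {"wonum": "WO-1001", "actfinish": date(2023, 4, 1), "actlabcost": 250.0},
    {"wonum": "WO-1002", "actfinish": date(2023, 4, 2), "actlabcost": -5.0},  # rejected
]
target = transform(source_rows)
print(target)
```

A production pipeline would run this logic in an orchestrated ETL/ELT tool (e.g., SSIS per the stack noted above) and route rejected rows to an error table rather than silently dropping them.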
- Meet with business analysts and subject matter experts to understand and document business requirements
- Translate customer requirements into unambiguous, scalable, robust and flexible technical solutions for implementation
- Create and maintain architecture diagrams, data models, mapping documents, business rules, data flow diagrams and other design related artifacts
- Assist the data warehouse team in designing efficient processes to load and manage data, including assessing data quality in the source systems and implementing appropriate business rules, data mappings, and transformation rules
- Actively participate in code reviews, unit testing, and system integration testing, and remediate solution defects
- Analyze and troubleshoot production issues quickly to ensure system uptime meets service level agreements
- Clearly and concisely communicate status of all assigned tasks to the project team, stakeholders and management
- Demonstrated ability to meet tight deadlines, follow development standards and effectively raise critical issues with the team
- Must be a self-starter who can work independently yet be a strong team player with excellent attention to detail and customer service skills
- Ability to prioritize and complete technical tasks with minimal instruction and supervision
- Establish strong working relationships with coworkers, customers, the project team, vendors, and staff at all levels of the organization
- Strong oral and written communication skills to convey technical details to non-technical staff and customers
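Several of the design responsibilities above (data models, mapping documents, efficient load processes) rest on the multi-dimensional modeling concepts named in the requirements. As a hedged illustration only, a minimal star schema in SQLite with one fact table keyed to asset and date dimensions; every table and column name here is hypothetical:

```python
import sqlite3

# Build an in-memory star schema: a maintenance-cost fact table joined
# to asset and date dimensions (all names illustrative).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_asset (asset_key INTEGER PRIMARY KEY, asset_num TEXT, asset_type TEXT);
CREATE TABLE dim_date  (date_key  INTEGER PRIMARY KEY, iso_date TEXT, year INTEGER);
CREATE TABLE fact_maint_cost (
    asset_key INTEGER REFERENCES dim_asset(asset_key),
    date_key  INTEGER REFERENCES dim_date(date_key),
    labor_cost REAL,
    material_cost REAL
);
""")
con.execute("INSERT INTO dim_asset VALUES (1, 'PUMP-100', 'PUMP')")
con.execute("INSERT INTO dim_date VALUES (20230401, '2023-04-01', 2023)")
con.execute("INSERT INTO fact_maint_cost VALUES (1, 20230401, 250.0, 80.0)")

# A typical analytical query: total maintenance cost by asset type and year.
row = con.execute("""
    SELECT a.asset_type, d.year, SUM(f.labor_cost + f.material_cost)
    FROM fact_maint_cost f
    JOIN dim_asset a ON a.asset_key = f.asset_key
    JOIN dim_date  d ON d.date_key  = f.date_key
    GROUP BY a.asset_type, d.year
""").fetchone()
print(row)  # ('PUMP', 2023, 330.0)
```

The separation of facts (costs) from conformed dimensions (asset, date) is what enables the standardized, cross-group reporting the business objectives call for.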
Reference Number: 5230