A common data model correlates and normalizes data for deep orchestration and visibility.
I had the opportunity to catch up with Bob Davis, CMO and Jeff Keyes, VP Marketing and Strategy at Plutora to learn about their latest product update to help DevOps engineers turn data captured across different development toolchains into insights and trackable actions to speed development while reducing errors.
Over the past 18 months, Bob and Jeff have observed new and existing clients using more of the data provided in the platform to gain greater visibility and deeper orchestration. The pandemic and the resultant remote work have driven greater reliance on technology and data to facilitate software development across geographically dispersed teams.
A common data platform is key to correlating and normalizing data to achieve deep orchestration and visibility, which is necessary when reaching more deeply into strategic planning.
Developers have no idea what happens once their release goes into production environments. They're in different silos. They don't know how features are performing or if the scope is changing. While there's reason to believe that everyone is working on the right priorities, there's no way of being certain. A data-centric platform with value stream flow metrics shows how things are performing and expands the available planning data.
Application Delivery Challenges
The core challenges of application delivery today include:
Fragmented visibility along the application delivery pipeline -- teams using different tools are unable to see each other's activity and artifacts.
Lack of visibility at the portfolio level (multiple pipelines) -- leadership and management lack a complete view of the software delivery process.
Poor management and optimization of disparate methodologies -- multi-speed IT leads to friction and bottlenecks, removing the benefits of agile and DevOps.
Weak methodologies to guide continuous improvement -- without measurement and management, there is no continuous improvement.
“There was no view into what others were doing. Information was very fragmented. Each group maintained their own spreadsheets. So, consolidating them into one view to see everything was very difficult.” -- Vijay Dwarakanath, Head of Infrastructure Delivery & Data, Centrica
Today the average customer has five integrations. Managing value streams across integrations requires data. Data enables the discovery and continuous measurement of cycle times to inform value stream maps. Use data in sprint reviews to inspect progress toward goals and improve OKRs that align with the long-term vision of the company. Sprint goals set in planning can be aligned with OKR improvement and recorded with the next target value stream map.
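As a sketch of that continuous measurement, cycle times can be derived from start and completion timestamps captured through toolchain integrations. The record format below is a hypothetical illustration, not any specific tool's schema:

```python
from datetime import datetime

# Hypothetical work-item records, as they might be pulled from
# toolchain integrations (field names are illustrative assumptions).
work_items = [
    {"id": "FEAT-1", "started": "2021-03-01", "completed": "2021-03-08"},
    {"id": "FEAT-2", "started": "2021-03-02", "completed": "2021-03-12"},
]

def cycle_time_days(item):
    """Cycle time: days from when work started to when it was completed."""
    fmt = "%Y-%m-%d"
    started = datetime.strptime(item["started"], fmt)
    completed = datetime.strptime(item["completed"], fmt)
    return (completed - started).days

times = [cycle_time_days(i) for i in work_items]
average_cycle_time = sum(times) / len(times)
print(average_cycle_time)
```

Measured continuously, averages like this feed the current-state value stream map, and the next target map records the improvement goal set in sprint planning.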
Data-Driven Value Cycle
Enterprises are able to go from workflow to a data-driven value cycle. More than 85% of the transactions inside the Plutora platform are generated through APIs and integrations; only 9% of the load is keystrokes. This reflects greater automation, fewer errors, and higher-quality code. A data-driven value cycle provides:
Insights and Analysis -- monitoring and observability provide insights into customer reaction to changes and report of value realization.
Portfolio and Backlog -- vision and goals are set and aligned to epics, features, PBIs, and user stories.
Continuous Integration -- code is created, artifacts are incorporated and version controlled, and code is built in a trunk-based manner.
Continuous Testing -- changes are tested and validated before release.
Continuous Delivery -- the changes are approved, released, and operated in the live environment.
Different metrics are needed for different areas of the business. The C-suite is worried about productivity, so waste is a good metric. Business leaders are concerned with speed and outcomes, so time, revenue, and retention are key. Developer and delivery leaders are concerned with outputs and processes, so metrics like story points and deployment frequency are important.
Flow and DevOps Metrics
Given this, the best practice is to measure value streams with flow and DevOps metrics. Together, these metrics describe how work flows through your value stream, how it is released, and how stable it is. A standard page of metrics includes lead time, cycle time, stability metrics like MTTR and change failure rate, and release metrics like throughput and deployment frequency.
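As a rough illustration, the stability and release metrics can be computed from deployment and incident records pulled from CI/CD and monitoring integrations. All field names and values here are hypothetical:

```python
from datetime import datetime

# Illustrative deployment and incident logs (assumed format, not any
# specific tool's export).
deployments = [
    {"at": "2021-06-01", "failed": False},
    {"at": "2021-06-03", "failed": True},
    {"at": "2021-06-05", "failed": False},
    {"at": "2021-06-07", "failed": False},
]
incidents = [
    {"opened": "2021-06-03T10:00", "resolved": "2021-06-03T14:00"},
    {"opened": "2021-06-10T09:00", "resolved": "2021-06-10T11:00"},
]

day_fmt = "%Y-%m-%d"
ts_fmt = "%Y-%m-%dT%H:%M"

# Deployment frequency: deployments per day over the observed window.
window_days = (datetime.strptime("2021-06-07", day_fmt)
               - datetime.strptime("2021-06-01", day_fmt)).days + 1
deployment_frequency = len(deployments) / window_days

# Change failure rate: share of deployments that caused a failure.
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

# MTTR: mean hours from incident opened to incident resolved.
repair_hours = [
    (datetime.strptime(i["resolved"], ts_fmt)
     - datetime.strptime(i["opened"], ts_fmt)).total_seconds() / 3600
    for i in incidents
]
mttr_hours = sum(repair_hours) / len(repair_hours)

print(deployment_frequency, change_failure_rate, mttr_hours)
```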
Value stream flow metrics let business owners see how long it takes to go from idea to revenue with:
Lead time/cycle time -- time to customer value from approval to delivery
Throughput -- work items completed in a value stream in a given time period
Work efficiency -- the proportion of time a work item is active versus its total cycle time
Cycle time -- total time from when work was started to when it was completed
WIP -- the total number of work items active or waiting in a value stream
Work breakdown -- the proportion of work item types (feature, defect, etc.) within a value stream
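The flow metrics above can be sketched from a snapshot of work items in a value stream. The schema and values below are illustrative assumptions, not a real tool's data model:

```python
from collections import Counter

# Hypothetical snapshot of one value stream's work items.
items = [
    {"type": "feature", "state": "done", "cycle_days": 10, "active_days": 6},
    {"type": "defect", "state": "done", "cycle_days": 4, "active_days": 3},
    {"type": "feature", "state": "active", "cycle_days": None, "active_days": None},
    {"type": "debt", "state": "waiting", "cycle_days": None, "active_days": None},
]

done = [i for i in items if i["state"] == "done"]

# Throughput: work items completed in the period.
throughput = len(done)

# WIP: work items active or waiting in the value stream.
wip = sum(1 for i in items if i["state"] in ("active", "waiting"))

# Work efficiency: active time as a share of total cycle time.
work_efficiency = (sum(i["active_days"] for i in done)
                   / sum(i["cycle_days"] for i in done))

# Work breakdown: proportion of each work-item type.
counts = Counter(i["type"] for i in items)
work_breakdown = {t: n / len(items) for t, n in counts.items()}

print(throughput, wip, work_efficiency, work_breakdown)
```

A low work-efficiency number here is exactly the "too many things at once" signal discussed below: items spend most of their cycle time waiting rather than being actively worked.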
A known time stealer is doing too many things at the same time; pointing out when this is happening helps make people more efficient. The biggest elements of efficiency are work breakdowns due to features and defects, technical debt, and risk with automation and quality ingestion. Treat everything as a work item. In most companies, the same developers are writing features and working on a re-architecture. Value stream analysis will highlight where value is being added versus lost.
Testimonials
Healthfirst began using the Plutora platform at the end of 2019. In 2020, they delivered 25% more releases with 33% fewer defects. Halfway through 2021, they are on pace to deliver 35% more user stories as a function of having more visibility. According to the head of release management, "using data to make informed decisions about software delivery is the way of the future. The data and analytics provided from Plutora’s new capabilities have helped Healthfirst develop tactics that keep the focus on optimizing our software delivery allowing us to deliver more value with higher quality to our Providers and 1.76 million members. The insights have been invaluable to helping us deliver more, faster and with higher quality while working remotely through an unprecedented pandemic."
The head of the Value Stream Consortium noted, "By adding value stream flow metrics to its newly enhanced platform, Plutora is harnessing the measurements that the technology industry really needs. Value stream flow metrics are a core industry standard for VSM which provides organizations with the most information available from a work optimization perspective. The data has always been available through the delivery pipeline and in the DevOps toolchain, but the challenge has been extracting the large amounts of information produced by the diverse range of tools used across teams within organizations. As extreme amounts of data are amassed, teams can now make use of it at scale by applying the value stream flow metrics in the Plutora Platform. This is the future of value streams achieving actionable insights and improving customer value delivery through their digital products."