Agile Advocate: Valuing Outcome over Output

The Agile Advocate is a series of thoughts or musings on the agile movement and DevOps in particular. It was motivated by the seminal work of social philosopher Eric Hoffer, "The True Believer" (1951) — at least in structure. It's not meant to be a manifesto, but simply a series of thoughts and reflections coming from over three decades of developing and delivering software and systems of a wide variety to an even wider variety of users and customers. My aim is not to evangelize or convert, but to provoke and stimulate discussion.

“We need more product!” “No, we need more quality product!” How many times have you heard that? But do we really need more product? Or more importantly, does our customer simply want more product, even more quality product?

Or does our customer want a product that improves their ability to perform their job? A product that makes a significant and substantial improvement in the way they work on a day-to-day basis. Something that provides a positive outcome, not simply another shiny rock they can pretend to use and then put on the shelf — next to the other shiny rocks.

More and more organizations claim to be pushing product at ever-increasing rates with daily or even hourly deployments. But what are they really delivering?

What I’ve seen, and even been a part of myself, is geared more toward product and volume rather than being customer- and outcome-centric. Here are some of the primary impediments I’ve observed hindering organizations from providing outcomes that positively impact their customers’ lives rather than delivering simply more shiny rocks.

Degrees of Client Separation

It never ceases to amaze me how organizations, both public and commercial, with direct access to their customers fail to take advantage of it. As a former product manager, I worked very hard and even paid for customers to provide feedback and share their priorities through focus groups and product conferences. And the reason was very simple: This feedback was our guiding light along the path to delivering not only a quality product, but more importantly, a product that made a significant and positive difference in how they performed their jobs.

If you’re developing a product or system for an end user, identify a customer or customers who can serve as points of contact and represent your target customer base. Develop real, substantive relationships with those individuals — no matter how painful it may be for either side.

If you can’t get a one-on-one relationship, then strive to keep the degrees of separation from your customers to an absolute minimum. When the degrees of separation start getting to two and even three degrees, then you stand the risk of losing touch with the customer’s real needs and priorities. And when that happens, you're forced to guess and speculate and the ability to course correct based on real customer feedback is significantly hampered, if not eliminated.

Pretend and Arrogant Certainty

We love certainty. And why not? It provides us with a sense of security and well-being. When a manager stands up and confidently proclaims her certainty in the outcome of a particular endeavor, we applaud her leadership. But if you really dig into that proclamation, you’ll see her certainty is some mix of arrogant and pretend certainty.

The pretend certainty comes simply from the fact that we expect those in leadership and management positions to “know” or be certain about where we are going and where we are going to end up. Because to say you don’t know would question your ability to lead.

Arrogant certainty comes from just that, arrogance. The perception that somehow you know more than the customer and you’ve figured it all out from the beginning. You’ve stated the end product and charted a course, and will proceed along that course no matter where it takes you or whether it results in an end product that brings any real value to the customer.

No matter where this certainty comes from, it’s in complete contrast to an agile mindset. You need to put aside the desire for certainty and instead adopt a discovery mindset. A discovery mindset understands that while you're starting with a plan, it’s just that — a plan. It’s a starting point with a vision of where you want to go — or at least where you think you’ll end up — as far as you know right now.

With a discovery mindset and agile practices, you’re constantly learning and discovering where your customer wants to go and can react to their changing, shifting requirements and priorities. The discovery mindset is not bogged down by certainty but embraces the idea that plans change and we are never as certain as we claim to be. As a product manager, I was always amazed after delivering some functionality to my customers or releasing another version of the product how different the end product was from what I and the team had originally envisioned.

Lack of Feedback Loops

But the discovery mindset is useless if one lacks the proper feedback loops to capture your customer’s assessments. In a previous article, “The Agile Advocate: Vectoring Development Toward Success,” I talked about this exact kind of feedback loop in the software development process: continuous feedback from user interaction with demonstrable, functioning software, enabled by continuous integration and continuous deployment processes.

Issues and feedback are captured and routed directly back to the development teams to be incorporated into the next build at a cadence and rhythm that supports the customer — whether weekly, daily or hourly. It’s with these continual, fine-grained course corrections that we move from an initial vision along a path to a product shaped by the customer. In doing so, we stand a much higher chance of delivering not just a product, or even a quality product, but a product that provides a positive outcome for the customer.

And these feedback loops aren’t limited to the development phase. They should be incorporated throughout the idea pipeline — from the recognition of an idea through to the retrospective after delivery. Each feedback loop allows an organization to apply a discovery mindset in vetting, crafting and delivering positive outcomes for the customer.

Outcome over Output

If an organization is truly interested in delivering outcome over output, then it first needs to adopt a true agile posture. This starts with ridding itself of the pretend and arrogant certainties and instead adopting a discovery mindset. Accept the fact that while you may have a plan, it’s simply a starting point along the eventual path to a viable outcome. Remove the degrees of separation between you and your customer. Embrace the customer as the source for course corrections, ensuring your product is not just another shiny rock but something that provides a positive outcome.

And finally, create the feedback loops at each phase along the idea pipeline that will allow you to course correct to your customer’s real needs.

Bots Aren’t People, But Should HR Treat Them Like They Are?

In a whimsical sense, robotic systems performing basic jobs can seem like a human resources dream: They work around the clock at high speeds on mundane tasks nobody else wants to do, and do it without complaint, paychecks or office confrontations. But agencies’ foray into using bots to handle business processes actually raises a host of questions HR departments might not currently be thinking about. Specifically, should they be managed like software programs or as some version of an employee?

For all of its well-known work in developing hardware robots for jobs such as working in space or disposing of bombs, the government has trailed industry in applying software robotics. But agencies are getting into the game.

The General Services Administration a couple of years ago began using a chatbot named Mrs. Landingham to keep new hires abreast of new forms, discussions and terminology they needed to know. GSA has since put other bots to work, as have state and local agencies. At least five federal agencies have worked on robotic process automation (RPA) projects, developing automated systems to perform rules-based, repetitive tasks currently handled by humans. One of those agencies, NASA, last year put its first bot to work in its Shared Services Center.

Nicknamed George Washington, the NASA bot performs clerical work such as opening email and copying and pasting information into an HR file. Over at the Treasury Department, the Bureau of the Fiscal Service's Office of Financial Innovation and Transformation is running two pilot programs to explore how RPA and distributed ledger technology (i.e. blockchain) could improve federal financial management.
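As a purely illustrative sketch (not NASA’s actual implementation), a clerical bot of this kind boils down to rules-based extraction and filing. Everything below — the email format, field names and filing function — is a hypothetical stand-in for a real HR system.

```python
# Purely illustrative sketch of a rules-based clerical bot, in the spirit
# of the RPA described above (not NASA's actual implementation). It "opens"
# an email, pulls out known labeled fields and files them into an HR record.
import re

# Labels the bot knows how to extract; these field names are assumptions.
FIELD_PATTERN = re.compile(r"^(Name|Employee ID|Start Date):\s*(.+)$", re.MULTILINE)

def extract_fields(email_body):
    """Pull labeled fields out of an email body into a dict."""
    return {label: value.strip() for label, value in FIELD_PATTERN.findall(email_body)}

def file_into_hr_record(hr_file, fields):
    """Copy extracted fields into an HR record (a dict standing in for the HR system)."""
    hr_file.update(fields)
    return hr_file

# A made-up onboarding email the bot might process.
email = """Name: Jane Doe
Employee ID: 4821
Start Date: 2018-01-08"""

record = file_into_hr_record({}, extract_fields(email))
```

The point of the sketch is that the bot follows fixed rules end to end; anything the pattern does not recognize is simply ignored, which is exactly why such bots suit repetitive, well-structured tasks.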

Jobs on the Line

The introduction of bots in any form is raising a number of concerns, the first being whether they will eliminate human jobs. And they will, according to projections, just as automation has done at least since the advent of the assembly line. But while some dour predictions say robots will take a third of all jobs by 2025, there is also the possibility bots could create other, better jobs.

A Deloitte report, “From brawn to brains,” for instance, found while 800,000 low-skilled jobs in the U.K. were lost to automation over a 15-year period, another 3.5 million higher-skilled jobs were created.

But aside from job projections, agencies have to consider how to manage the incoming influx of virtual employees — including whether to treat them as IT assets or pseudo-employees.

“These are the types of questions that we’re going to have to grapple with, and they’re completely outside of the conversations we’ve ever had in the past about initiatives like this,” Justin Herman of GSA’s Emerging Technologies Program said during a recent panel discussion in Washington, D.C., GovernmentCIO Magazine previously reported.

While there are obvious differences between real-life and virtual employees, the AI and machine learning aspects of bots – and the jobs they’re able to do – make it a valid question. NASA’s George Washington, for instance, has its own email account and permissions for each application it deals with, like other employees. There also could be the aspects of performance and what virtual employees add to an organization to consider. Earlier this year, the European parliament raised the question of whether to tax robots based on the extent they and AI contribute to a company’s bottom line.

Part of the Team

But perhaps the key element in managing AI bots is that, in government as well as in industry, they must work in collaboration with human employees — one of the benefits cited by agencies using bots is that it frees up human workers for more important tasks.

“Understanding the capabilities of this new workforce and ensuring they work in conjunction with your existing teams is a key activity for HR,” Paul Donaldson, the U.K. practice lead for automation at a sourcing consultancy called Alsbridge, told HR Magazine.

Human-machine teaming, in fact, is a primary goal for government agencies making use of AI. The Defense Department’s Third Offset Strategy includes a focus on human-machine collaboration in systems such as the F-35 Joint Strike Fighter — which gathers and analyzes massive amounts of data before presenting it to the pilot to aid decision-making — or in swarming systems that combine the efforts of manned and unmanned aircraft. The Defense Advanced Research Projects Agency last year launched its Agile Teams program, looking to use a mathematical approach to optimize hybrid human-machine systems.

Agencies are seeing the benefits of RPA and other forms of AI-based automation as a relatively easy and inexpensive way to improve performance. But as bots and other systems become more prevalent, managing those systems becomes a more pressing question.

“The fact of the matter,” states the HR Trend Institute, “is that HR departments currently have important questions to face, and they can no longer afford to look the other way.”

Countries Compete to be Digital Transformers

The pursuit of digital transformation and adoption is a global one, and as technology reshapes our everyday lives, countries compete to keep up. They’re developing new tech, automating manufacturing and preparing the right skill sets for the future.

But some countries hold strengths in certain areas of IT, while others are climbing the ranks. The Organisation for Economic Co-operation and Development pulled from “200 indicators drawing on the latest internationally comparable data” to form its 2017 OECD Science, Technology and Industry Scoreboard. It uncovers the countries on top and shows how the transformation is affecting science, innovation, the economy and citizens’ lives so governments can prepare policies for a digital era.

The G20 includes: Argentina, Australia, Brazil, Canada, China, France, Germany, India, Indonesia, Italy, Japan, Mexico, the Russian Federation, Saudi Arabia, South Africa, Korea, Turkey, the United Kingdom, the United States and the European Union.

Science, Innovation and the Digital Revolution

  • The U.S. is a leader in the internet of things among the G20 economies. It had the highest number of machine-to-machine (M2M) SIM cards per person as of June 2017, but ranks sixth globally.

  • In terms of SIM cards in machines, China had the most in 2017, accounting for 44 percent of global M2M connections and three times the U.S. share.

  • The U.S. produces the most scientific documents on machine learning. It has the biggest share of the world’s top 10 percent of most-cited scientific publications — but that share is declining as China produces more (and better-quality) scientific research.

Growth, Jobs and the Digital Transformation

  • The U.S. lags in robot intensity (measured as the number of robots used in a sector divided by the overall value created by that sector). The U.S. measures at one-sixth of the robot intensity in Korea and one-fifth of that in Japan. Korea and Japan lead in robot intensity, and China is catching up to U.S. levels.

  • From 2010 to 2016, the U.S. had the largest net employment gains in the OECD of more than 12 million jobs, topping Turkey, Germany and the U.K. These net gains are recorded in wholesale and retail trade, business and public services, and manufacturing and construction.

  • China, Chinese Taipei, Japan, Korea and the U.S. lead the way in developing cutting-edge digital technologies.

Artificial Intelligence

  • AI is also growing, as the number of AI technologies patented across the five largest intellectual property offices (IP5) rose by 6 percent a year from 2010 to 2015.

  • Japan, Korea and the U.S. accounted for more than 62 percent of AI-related IP5 patent applications, and European Union countries account for 12 percent of the top AI inventions. These shares have actually dropped from previous years because of a rise in Chinese filings.

  • From 2012 to 2014, corporations based in Japan, Korea, Chinese Taipei and China accounted for 70 percent of all AI-related inventions belonging to the world’s 2,000 top corporate R&D investors and their affiliates. Firms headquartered in the U.S. accounted for 18 percent of those inventions.

FITARA Scores Reveal Grades Your Mother Wouldn't Be Proud Of

Federal IT modernization is a work in progress, requiring leadership buy-in, acquisition reform and a culture change — but adopting cloud services and gaining full network visibility are proving to be difficult obstacles to overcome.

The Federal Information Technology Acquisition Reform Act Scorecard 5.0 was released Nov. 14 with not-so-promising results. A fifth category was added to assess agency management of software licenses, which negatively impacted final grades. Only three agencies improved their FITARA scores from June to November, 15 stayed the same and six worsened.

“So progress is being made, just not as quick as it should be and needs to be,” Rep. Will Hurd, R-Texas, said in the Nov. 15 Oversight and Government Reform IT Subcommittee FITARA hearing. Seven agencies have software license inventories, and the others without inventories received a failing grade.

But this isn’t the only trouble area.

According to Dave Powner, director of IT management issues for the Government Accountability Office, agencies struggled the most with data center optimization on scorecard 4.0, largely because a metrics category was added, rather than it just being based on savings.

The departments of Education and Housing and Urban Development are out of the data center business, but 19 agencies still have a ways to go.

“There’s about at least a third of the agencies that project they’re going to be nowhere near optimizing their centers, and they ought to be looking to outsource that and go towards the cloud for any of those data centers,” Powner said.

And with scorecard 5.0, adding software licensing didn’t help.

“When you have 17 agencies getting F’s because they don’t have a software license inventory, that’s a key reason for poor grades, too,” Powner said, though he added he has a hard time understanding why.

“We did a report four years ago that told agencies they should get software licenses, it was in FITARA . . . I think it’s inexcusable that we do not have software license inventories at this point in time,” Powner said.

Two agencies represented at the hearing face these specific challenges:

Energy Department

The department’s score dropped from a C- to a D+. It has historically struggled with chief information officer tenure and software license inventory, but perhaps more urgent is its F in data center consolidation.

“Your score went down, not up, which suggests regression,” said Rep. Gerry Connolly, D-Va. He expressed concern over the department’s resistance to consolidating its national labs’ data centers.

According to Energy’s CIO Max Everett, the department has closed 84 of 289 data centers since 2010 and plans to shutter 11 more by the end of fiscal year 2018. But these numbers are disappointing.

“We have too many data centers that we don’t have a handle around,” Everett said. “We got to move to the cloud, and we have to do it faster.”

The department is working to move commodity and business services to the cloud, and is piloting data center management systems in some labs in hopes of expanding departmentwide. Everett said these systems can help identify the data centers it doesn’t need anymore.

And when asked whether he knew everything inside his network, Everett said no.

“My assumption is if you have a number of agencies that don’t understand what software they have on their system, they also don't know what hardware they have on their system, and that introduction of unknown vulnerabilities is scary,” Hurd said.

Connolly also urged Everett to strive to exceed the department’s goal of closing 11 data centers.

“If you don’t set heroic goals, stretch goals, nothing happens,” Connolly said.

The Small Business Administration

SBA improved from a D- to a C-, but also struggles with CIO tenure. Since 2004, SBA has had 10 CIOs, and high turnover does not help IT modernization efforts. Yet, more pressing is SBA’s F in software license inventory, which the administration plans to complete by early 2018.

SBA CIO Maria Roat said the administration is tackling software licenses in three ways. The first is to reduce the footprint of duplicative software. The second is to cut the number of licenses and to assign the right level of software licenses to those who need them.

The third is to fully understand all of SBA’s software, a process Roat began a couple of months ago.

“We’ve already embarked on getting our arms around our licensing, in particular as we’re moving into the cloud,” she said, and the monitoring tools are already in place.

“A year ago, I didn’t have visibility into the entire enterprise, I do now,” Roat said. This insight allows her to see what licenses are out there, what’s deployed in the cloud and on desktop systems.

So far, the FITARA scorecard includes five of the seven areas of IT reform in the legislation. According to Hurd, scorecard 6.0 will measure whether agencies have established working capital funds as authorized by the Modernizing Government Technology Act. Eventually, Hurd wants to see the scorecard evolve from a FITARA scorecard to more of an overall digital hygiene score for agencies.

GCIO Focus: 6 Steps to Becoming Truly Customer Obsessed

Executives know the importance of improving customer experience and moving toward the day when all strategies, plans and tactics align to enable true obsession with improving the value delivered to customers. Lawmakers also understand this and have enacted legislation that aims to clear regulatory hurdles to becoming customer obsessed.

The question is, where to begin this journey?

The answer depends on how mature your organization is today in becoming customer obsessed. So, clearly, your first step needs to be assessing where you currently stand. Properly assessing that requires a comprehensive approach – it cannot be done by simply examining experiences, satisfaction scores and other isolated metrics. Rather, a well-done assessment evaluates the whole service delivery ecosystem, thereby providing the baseline and information needed from which to build a successful strategy. There are six critical components to determining your starting point and strategy:

  • Structure: A key element in developing your strategy is to coordinate an organizational structure that breaks down existing silos and allows for seamless collaboration.

  • Culture: While executive support and training are important, the core change is to evolve to an organization always prioritizing the customer and customer’s needs.

  • Skills: Like culture, training is important, but shifting hiring practices from mostly skills-based to hiring for the customer-obsessed mindset needed for success matters even more.

  • Metrics: The importance of measurement cannot be overstated, and it is crucial to develop both executive-level metrics measuring the delivery of value to customers and functional-level metrics that tie to those executive-level measures.

  • Processes: This is largely a choice to work differently together. Yes, there are challenges, but if leaders form executive-level partnerships in support of working differently, and both facilitate and verify that their teams are working collaboratively, progress will happen. It is also important to adopt a common approach to experience design that connects ideation, design and development.

  • Technology: While it is no secret the government is embarking on a digital transformation, it is extremely important to focus increasing time and money on the systems that engage and serve customers. Successful organizations ultimately operate with largely commoditized, standard internal systems and much more customized, flexible employee- and customer-facing assets.

This first step in becoming an organization aligned and relentlessly focused on delivering value to customers is not easy and requires strong leadership and leadership coordination. However, your ability to subsequently create a winning strategy to become customer obsessed depends on it.

GCIO Focus: 4 Steps to a Solid Data Strategy

If you follow the trail of any trend in government IT, you will ultimately arrive at the same place that will determine your relative success — data. Whether it is current initiatives like cybersecurity, citizen experience or IT transformation or emerging capabilities like artificial intelligence, data is the foundation.  

To execute these or any other initiative efficiently and effectively, you need to begin with clean, available and current data. To build this foundation, every organization needs a data strategy that not only meets the needs of the business today, but also provides the flexibility to adapt as business needs change in the future.

The right data strategy must provide your business with rapid and comprehensive access to all of the data it needs. Every organization starts from a different place in the process, but all must develop a data strategy that clearly determines where you are today, where you need to go and a road map for how you are going to get there. If you follow the steps below, you can build a data strategy that will serve as the foundation for all of the current and future capabilities that will drive business results:

Build the team: As with the great strides being made in areas like agile and DevOps, you need to assemble a team that spans the organization, bringing together and coordinating the “owners” and the “users” of the data. This cross-functional working group of subject matter experts will help the organization understand both the business requirements and data ownership across the organization.

Assess the needs: While this certainly includes traditional requirements gathering, it must go a step further to understand and document the business capabilities and processes that create, transform and use the data. This includes current use cases, future use cases, mapping the data needs and aligning the identified capabilities to the overall business strategy.

Analyze the options: In analyzing the options, it is important to identify changes that should be made to the processes for managing data to support the key business objectives. Of course, you also need to settle on a target data architecture that supports the objectives and the new or modified processes. Finally, you need a plan to implement the selected architecture options.

Prioritize the options: This critical step requires you to share the selected options and associated implementation plans with the key business leaders you will need on board for short- and long-term support and funding. It is also valuable to test your options with others not yet involved in the process to get an outsider’s feedback. Final prioritization should be done using a framework that can “score” each option against agreed-upon criteria, focused on the business value each will deliver.
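A scoring framework of this kind can be sketched as a simple weighted sum. This is a minimal illustration under stated assumptions, not a prescribed method: the criteria names, weights and candidate options below are hypothetical placeholders to be replaced by whatever your stakeholders agree on.

```python
# Minimal sketch of a weighted scoring framework for prioritizing options.
# Criteria names, weights and the sample options are illustrative assumptions.

# Agreed-upon criteria and their relative weights (should sum to 1.0).
CRITERIA_WEIGHTS = {
    "business_value": 0.5,
    "cost": 0.2,   # higher rating = lower cost
    "risk": 0.3,   # higher rating = lower risk
}

def score_option(ratings):
    """Combine per-criterion ratings (1-5) into a single weighted score."""
    return sum(CRITERIA_WEIGHTS[c] * r for c, r in ratings.items())

def prioritize(options):
    """Return option names ordered from highest to lowest weighted score."""
    return sorted(options, key=lambda name: score_option(options[name]), reverse=True)

# Hypothetical candidate options with per-criterion ratings.
options = {
    "data lake": {"business_value": 4, "cost": 2, "risk": 3},
    "master data management": {"business_value": 5, "cost": 3, "risk": 4},
    "status quo": {"business_value": 1, "cost": 5, "risk": 5},
}
ranking = prioritize(options)
```

The value of even a toy framework like this is that the weights make trade-offs explicit, so the resulting ranking is something business leaders can inspect and challenge rather than a black box.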

To build efficient and effective services — for today and the future — you need to have a data strategy that supports building bricks (services) out of clay (data). Following these four steps will help you get there.

Read Mark Western's previous columns:

Driving Your Strategy Through the Data Mountain

Prioritization Key to American Technology Council's Plan to Modernize Government IT

5 Steps to Digital Business Transformation

Customer Experience Act Focuses on Outside-in Approach

Measure the Wrong Things, Get the Wrong Outcomes

Agile Advocate: The Feedback Loop

In a prior article, “Top 3 Impediments to Agile,” I listed the lack of a comprehensive continuous integration process as one of the hurdles to IT in achieving an agile posture and delivering at the cadence that supports their organizational objectives. In particular, I highlighted the need to clearly specify your CI requirements and ensure the implementation provides the functionality and support to produce viable release candidates.

Your CI process and environment need to support your development teams as they continually develop functionality for your end users. But effectively getting to the product that’s best for your users takes more than just a comprehensive CI process.

The Product of Our CI

So, you’ve done the hard work and spec’d out the requirements for your CI process and the supporting environment. You went that extra yard and ensured the implementation supplied the functionality and operational capabilities that meet those requirements. You’ve diligently tested and validated the CI implementation. It’s provisioned via infrastructure as code (IaC) and scales under load. Unit, integration and even some level of performance testing can be performed. Static code analysis drives quality, and code metrics can be gathered and reported to ensure quality is trending upward.

Before you know it, your development teams are on board and producing their first builds. The code is being built, compiled, tested, packaged and integrated. The teams are proceeding toward their deliverables based on their interpretation of the requirements and acceptance criteria. At the end of the sprint, we’ll have a look at what they’ve produced and see how close they’ve come to what the product owners (POs) had specified and envisioned.

Why Gamble?

But why wait until the end of the sprint (or even the release, if you really want to live dangerously) to see if development is on track? What about validating the application and its workflow, along with testing the functionality, as soon as possible? Or as soon as a specific piece of functionality, something you’re particularly interested in or are responsible for, is available?

Well, with your well-functioning CI process, there’s no reason you can’t. We just need one more thing — a feedback loop.

So, what exactly do we need to do to get that feedback loop?

Well, first we need to establish an environment where the CI product can be deployed for validation and testing. This new environment should be in addition to any integration environment used by the CI process. If developers are adhering to the practice of regular commits to the code base, then this integration environment will be updated with each commit and prove far too dynamic for POs and testers to properly interrogate and validate functionality. I refer to this new environment as the release candidate, or RC, environment, as it will host the most recent release candidate generated by the CI process.

Triggering Reviews

The decision to deploy an RC should be made by POs when they determine a release candidate is functionally complete enough, or some functionality has recently been added that they want to review and provide feedback on to the development team.

But first, POs will need information to even trigger that deployment.

First, the CI process has to send out notifications upon the successful completion of a build. This is something that should already be in place with your CI, and if not, it’s easy to implement.
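As a rough illustration, such a notification can be as simple as a script the CI server runs as a post-build step. This is a hedged sketch: the project name, build number, URL and JSON payload shape below are assumptions for illustration, not any particular CI product’s or chat tool’s API.

```python
# Hypothetical post-build notification payload. A CI server would run a
# script like this on build success and POST the result to a chat webhook;
# the payload shape here is an assumption, not a specific tool's API.
import json

def build_success_payload(project, build_number, release_notes_url):
    """Format the message POs receive when a build completes successfully."""
    return json.dumps({
        "text": (f"{project} build #{build_number} succeeded. "
                 f"Release notes: {release_notes_url}")
    })

# Made-up example values for illustration.
payload = build_success_payload("inventory-service", 128, "https://ci.example/notes/128")
```

The key design point is that the notification carries enough context (project, build number, a link to the notes) for a PO to decide whether this build is worth deploying to the RC environment.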

Secondly, there must be a comprehensive and continually updated document listing the features and bug fixes included with each build, commonly referred to as release notes. These release notes should be created automatically by the CI process by extracting the relevant commits and comments from the version control system (VCS).

The effectiveness of this process and the quality of the release notes rely on developers posting not only quality comments but also relating them to a story and/or task identifier. Together, the identifiers and comments will allow POs to better understand what’s in an RC and determine whether it’s something they’d like to interrogate and validate.
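To make this concrete, here is a minimal sketch of generating release notes from commit messages. It assumes a hypothetical "PROJ-123: description" commit convention; the identifier pattern and sample messages are illustrative, and a real pipeline would pull the messages from the VCS (for example, the commits since the previous build tag) rather than a hard-coded list.

```python
# Sketch of auto-generating release notes from commit messages, assuming a
# hypothetical "PROJ-123: description" commit convention. A real CI step
# would feed in messages extracted from the VCS since the last build.
import re

STORY_ID = re.compile(r"^([A-Z]+-\d+):\s*(.+)$")

def release_notes(commit_messages, build):
    """Turn raw commit messages into simple release notes keyed by story id."""
    lines = [f"Release notes for build {build}"]
    unmatched = []
    for msg in commit_messages:
        m = STORY_ID.match(msg.strip())
        if m:
            lines.append(f"- [{m.group(1)}] {m.group(2)}")
        else:
            unmatched.append(msg.strip())
    # Commits without a story identifier are flagged so POs can spot them.
    for msg in unmatched:
        lines.append(f"- (no story id) {msg}")
    return "\n".join(lines)

# Made-up commit messages for illustration.
notes = release_notes(["PAY-12: add invoice export", "fix typo in README"], "1.4.0-rc2")
```

Flagging commits that lack an identifier, rather than silently dropping them, gives teams a gentle forcing function toward the commit discipline the process depends on.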

If it is an RC of interest, it’s tagged, deployed to the RC environment and made available to POs and testers. During a review, POs need to record any issues and enter them in the project’s issue-tracking system. These issues need to be categorized, easily identified as “RC.x.x.x feedback” and traceable to a specific RC. It’s this careful review and detailed feedback on the current functionality that provides the crucial input into reviews with the development teams.
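The traceability described here can be sketched in a few lines. The label format follows the “RC.x.x.x feedback” convention above; the issue fields and sample feedback are hypothetical, standing in for whatever your issue-tracking system actually records.

```python
# Sketch of recording PO feedback so every issue traces to one specific
# release candidate, using the "RC.x.x.x feedback" labeling convention.
# The issue fields and sample data are illustrative assumptions.

def feedback_issue(rc_version, summary, details):
    """Build an issue record labeled so it traces back to one RC."""
    return {
        "label": f"RC.{rc_version} feedback",
        "summary": summary,
        "details": details,
    }

def issues_for_rc(issues, rc_version):
    """Pull every piece of feedback recorded against a given RC."""
    wanted = f"RC.{rc_version} feedback"
    return [issue for issue in issues if issue["label"] == wanted]

# Hypothetical tracker contents after two RC reviews.
tracker = [
    feedback_issue("1.2.0", "Export button missing", "Reports page has no CSV export."),
    feedback_issue("1.2.0", "Slow dashboard load", "Initial load takes over 10 seconds."),
    feedback_issue("1.3.0", "Login redirect loop", "SSO users bounce back to login."),
]
```

Because each issue carries its RC in the label, the review meeting with the development team can filter the tracker to exactly the feedback raised against the build under discussion.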

Fine Tuning

After reviewing the application state and recording their assessments, POs and development teams should meet to discuss. It’s here POs can provide critical and timely feedback on where they see the product going and how it may differ from their vision. It also allows the development teams to explain to the POs why some functionality and features came out the way they did and to offer possible alternatives.

These reviews should not be seen as a witch hunt, nor as missteps by either party, but as an invaluable opportunity to fine-tune and steer the product, guiding it to the end state that will best serve the end user.

These meetings should produce new tasks and directives for the development teams, which should be reflected in their backlog of work and eventually make it into upcoming release candidates for review.

This process continues until the POs and testers are satisfied the functionality meets the expectations for the release. We can then cut a release branch and start migrating our release candidate down the deployment pipeline to production.

Steer Early and Often

Communication is challenging, especially when attempting to convey complex ideas and concepts. You should expect misinterpretations and misunderstandings, resulting in gaps between implementations and expectations.

But as any engineer will tell you, the earlier these are identified and remedied, the better off you’ll be in both costs and the end product.

When we marry DevOps processes and practices with agile methodologies to produce an effective feedback loop, we have the opportunity to identify these shortcomings and misinterpretations early and often. It also gives us the opportunity to apply fine-grained adjustments to the development effort and in the process, we stand a far greater chance of delivering the product we and our end users expect.

Read William Drew's previous columns:

DevOps, Agile and Microservices

Paving the way to DevOps

Musings on DevOps

What is Infrastructure as Code and Why Do I Care?

Including Ops on the Journey to the Cloud

What's in Your Continuous Integration?

How to Involve Operations in a DevOps World

Top 3 Impediments to Agile

How One CIO Started a ‘Rebel Alliance’

Typically, when you start a new job, the last thing you want to do is stir the pot. But challenging the status quo is exactly how one government chief information officer built a coalition of like-minded people to drive change departmentwide.

After a long Navy career, Chad Sheridan transitioned into the civilian world in 2011 as CIO of the Agriculture Department’s Risk Management Agency. In those early days, he didn’t know anyone there, but somehow ended up spearheading an important agencywide IT coalition, cutting through the department’s federated tension.

Sheridan didn’t completely agree with how USDA’s CIO office ran the department when he arrived. At that time, Sheridan said, department CIO Christopher Smith believed he should be in charge of everything.


"And there were some good reasons behind that,” Sheridan told GovernmentCIO Magazine, as some areas in the department were still too federated. And while Sheridan didn’t report directly to the CIO, he was still given the “I’m in charge” talk.

And during that time, USDA’s CIO Council, which consists of department-level folks and bureau-level CIOs, also wasn’t very collaborative. According to Sheridan, the council communicated its plans and what it wanted from staff via PowerPoint presentations, rather than through robust dialogue or collaboration.

So, rather than fight the CIO or the council, Sheridan began periodically meeting with his peers and other smaller agency CIOs for lunch.

Finding Commonalities in Smaller Agency Concerns

“We didn’t really like any of the things that were coming down,” Sheridan said.

At the time, the Risk Management Agency had 500 end users and the department as a whole had more than 100,000. And because his agency and the other smaller USDA components — Sheridan calls them “smalls” — didn’t consume enough of the end user services, they felt their opinions and pushback got ignored.

“So if it comes to anything end user related, I’m hosed, because who would listen to me? I’ve only got 500 people; I’m not their major customer,” Sheridan said.

The smalls felt unheard because at the time, USDA was looking to improve departmentwide end user services by combining the larger agencies with the most end users, Client Technology Services and the Forest Service, and gaining insight from them. Together, these agencies accounted for two-thirds of the department’s end users.

“If they align those two, the rest of us smalls were just going to get run over,” Sheridan said, “if we all didn’t band together.”

Coming Together for Greater Good of IT

Sheridan formed the “small agency CIO group” to collect the input and perspective of all the smalls, which together made up the remaining one-third of USDA end users. He built on those lunches he had been arranging early on.

“We all had similar concerns: We want to do the right thing, but nobody’s listening,” he said. The group met every two weeks “just to watch our own backs,” and make sure everyone received the same messages from department leaders and had the same issues — power in numbers.

Among a few joining Sheridan to drive the group was Rory Schultz, deputy CIO for the Food and Nutrition Service at the time. He coined the term “rebel alliance” for the group because all involved were opinionated and felt their voices weren’t being heard.

Realizing Rebel Alliance’s Full Potential

It wasn’t until Smith left the department and incoming CIO Cheryl Cook stepped in with some mission-driven members that the work the rebel alliance was doing was finally acknowledged. For example, the rebel alliance took on the IT portion of a departmentwide shared services reform effort when it felt the initiative wasn’t getting IT involved.

Soon after, the rebel alliance’s eagerness to help IT initiatives earned it legitimacy.

“I started hosting what was called the CIO Council Advisory Board,” Sheridan said. “It had no charter, had no authority, had nothing.”

But getting that off the ground came with its challenges. Cook and other leaders supported the effort, but there was still a heavy cultural resistance to a “collaborative, community organized-type environment.”

That took time to change, Sheridan said, but “we weathered it, I just kept going.”

Evolving into Today’s CIO Council Advisory Board

Ultimately, that small agency CIO group became the nucleus of what is now the department’s CIO Council Advisory Board, providing an IT perspective for USDA-wide administrative services improvement.

The board is chartered in departmental regulation, has equal representation and elected voting members, and follows parliamentary procedures. Sheridan was elected vice chair twice and currently serves in that role.

“I know how to pull people together, to get things done even if I have zero authority, zero budget, zero mandate,” he said. “It’s force of will, I’m not going away and I’m going to keep charging until I get fired. And if I get fired for trying to do the right thing and bring people together to solve a problem, then it was meant to be and I can live with that.”

GCIO Focus: 5 Steps to Digital Business Transformation

Digital business transformation is not just a technology project. Of course, technology is an important component of transformation, but the real focus should be on getting to an end state that enables the business of government to continually and rapidly act and adapt to meet its business and mission objectives.   

When embarking on this journey, transformation efforts may initially focus more on internal operational excellence, improving digital customer experience, or a combined hybrid approach depending on the organization. Regardless, successful digital business transformation will ultimately require digital operational excellence and digital customer experience mastery.  

Accomplishing this requires that an organization successfully navigate five essential steps to digital business:


Assess: A critical first step is to determine your organization’s preparedness for digital transformation through an assessment of culture, organization, technology and measurement. Important things considered in this phase include whether you have the right skills, structure, incentives and metrics.

Build the team: It is important not only to have leadership support, but also to have alignment throughout the cross-functional team(s) that will execute the transformation. These teams support the digital transformation by helping align the organization’s values, goals and strategic vision and enabling the open exchange of information and ideas to drive holistic problem solving.

Innovate: Innovation in digital transformation can take many forms, but the most impactful and important innovations are related to business processes — streamlining, adapting and creating processes to enable transformation. Innovation must also be structured in a manner that enables sustained strategic innovation to occur.  

Measure: Whether you begin your transformation with an inside-out operational focus, an outside-in customer experience focus, or a hybrid approach, it is essential to track metrics that demonstrate the business value your transformation is creating (or not creating). This approach allows for rapid adjustments to how you prioritize your investments and efforts.

Embed: The ultimate goal is to embed the previous four steps into your organization, to make it the way you operate naturally on a daily basis. This move allows you to accelerate and scale your digital transformation initiatives throughout the organization.

Digital business transformation is a journey, not a one-time event. Following the five steps above will help your organization successfully transform into an agile and adaptive digital business.

GCIO Focus: Customer Experience Act Focuses on Outside-in Approach

In May 2017, Sen. Claire McCaskill, D-Mo., introduced a bill known as the Federal Agency Customer Experience Act of 2017. If enacted in its current form, the bill would put more consistent attention and effort on federal agencies’ taking more of an outside-in approach to delivering citizen services.

In short, the bill would require the chief performance officer or other senior accountable official for customer service at each covered agency to collect voluntary feedback from citizens. Those results would be made public online and via an annual scorecard that also includes an “analysis of administrative and legislative barriers to improving service delivery.”

Enacting this or similar legislation is the first step in implementing much more comprehensive and impactful Voice of the Customer programs across government. VoC programs are an organized approach that not only collects customer feedback (surveys are one method), but also focuses on rigorous analysis of what the feedback means and, most importantly, leverages that analysis to improve the organization’s decision-making and service delivery.


As the government embarks on implementing this legislation, it is important not to lose sight of additional steps to take to reach the end goal — agile and efficient improvements in services provided to citizens:

  • Listen: The surveys required by the bill are just one of the methods necessary to gain a valid understanding of citizens’ real attitudes and behaviors. Reaching that understanding requires many more modes of both active and passive listening, such as interviews, focus groups and social media monitoring.

  • Analyze: Once collected, the feedback needs to be effectively analyzed to understand and interpret what citizens are saying about the services they receive. It’s not enough to know there is a “score” of some sort for a given service; agencies must understand why citizens rate services as they do. Further, agencies need a way to effectively communicate and share these results across the organization so action can be taken.

  • Act: While listening and analyzing are necessary, they are pointless if no action is taken to improve. Observers have pointed out that the bill’s design could unintentionally hamper the government’s ability to resolve service issues directly with individuals. The bill’s requirement to report annually on barriers to improving service is notable: those barriers must be addressed expediently, or the listening and analysis effort will have been wasted and, worse yet, the failure to act on citizen feedback will only further erode citizens’ confidence.

  • Measure: Each organization needs to develop appropriate measures of success for all of the efforts outlined above. While these will differ from agency to agency, it is critical that each organization implement metrics clearly connecting changes in satisfaction and service to the desired business and mission outcomes for each service.

The Federal Agency Customer Experience Act is a good and necessary step. Moving from hearing how citizens say they feel about their experience to truly understanding citizens, improving their experience and measuring the business impact will require significant additional effort.

101 Constitution Ave NW, Suite 100 West Washington, DC 20001

(c) 2017 GovernmentCIOMagazine. All Rights Reserved.