Every state and local agency relies on highly specialized software that is terrible. Every type of agency needs software that enables its core mission, and in pretty much every case that load-bearing software is awful and everybody hates it.

The market looks the same for every agency. There's one system that's bad but widely used, the result of Tyler Technologies buying several competitors and merging their offerings into a single product. There's another package from another vendor that is bad in different ways and also fairly widely used. And then there are 1–2 systems built by mom-and-pop shops working out of a garage. All of these products look like hot garbage, and all of these vendors are really in the business of lock-in rather than of making decent software. The result is that agencies are limited not by law, but by what their software vendor will permit them to do. I routinely help agencies get out of this trap.

There is a hypothetical collective solution: creating escalating software standards across jurisdictions. Markets are rational; vendors sell bad software because agencies don't demand good software. The solution is to move those demands into RFPs.

Economists talk about switching costs, the financial hit you take by changing suppliers. Let's bring those costs down. A big source of switching costs is data exchange: getting data out of an old system and into a new one is so daunting that agencies will use bad software indefinitely to avoid the problem. If moving data between systems is easy, there may still be contractual obstacles and change-management obstacles, but the part that government is worst at, technology, is covered.

If one county agency or state agency makes these demands of vendors, it will be ignored. But 2 agencies? 5? 10? 50? Imagine that a critical mass of agencies team up and declare, with one voice, that they're all raising their standards at once as their existing contracts come up for renewal. Their escalation path might look something like this:

1. In 2 years, any new RFPs for the software will require data export functionality and will limit contract durations to three years.
2. Also in 2 years, agencies will team up to build shared software to convert data between the storage formats used by the major vendors' systems.
3. In 4 years, agencies' RFPs will require data import and export functionality in an open, prescribed format, based on what agencies learn while building their data-conversion software.

A lack of collective action created this sclerotic market for specialized agency software. By banding together, agencies could reshape the market to support their changing needs; a sketch of what that shared data-conversion layer might look like follows below. Government needs to be able to deliver at the speed of need. The market will not improve without collective action, and this intervention is lightweight, inexpensive, and will result in better software for the entire country, not just the agencies that participate in it.
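As a minimal sketch of what that shared data-conversion software could look like: the script below maps a hypothetical vendor CSV export onto a hypothetical open record format. The vendor names, column names, and the "open" schema here are all illustrative assumptions, not a real standard or a real vendor's export layout.

```python
# Sketch of a shared conversion tool: map a vendor's proprietary CSV export
# onto an open, prescribed record format. All field names are hypothetical.
import csv
import json

# Per-vendor mapping from proprietary column names to open-format fields.
VENDOR_FIELD_MAPS = {
    "vendor_a": {"CaseNo": "case_id", "FiledDt": "filed_date", "Sts": "status"},
    "vendor_b": {"case_number": "case_id", "date_filed": "filed_date", "state": "status"},
}

def export_to_open_format(vendor: str, csv_path: str, out_path: str) -> int:
    """Read a vendor's CSV export and write open-format JSON records."""
    field_map = VENDOR_FIELD_MAPS[vendor]
    records = []
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Keep mapped fields; preserve unmapped columns under an
            # "extensions" key so no data is silently dropped.
            record = {open_name: row.get(vendor_name)
                      for vendor_name, open_name in field_map.items()}
            record["extensions"] = {k: v for k, v in row.items()
                                    if k not in field_map}
            records.append(record)
    with open(out_path, "w", encoding="utf-8") as f:
        json.dump(records, f, indent=2)
    return len(records)

if __name__ == "__main__":
    n = export_to_open_format("vendor_a", "vendor_a_export.csv", "open_records.json")
    print(f"Converted {n} records")
```

The point of the "extensions" bucket is that a shared converter built by many agencies cannot know every vendor quirk up front; preserving unmapped columns keeps migrations lossless while the open schema matures.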
How Agencies Are Tackling IT Inefficiencies
Summary
Government and private agencies are working to improve IT efficiency by addressing outdated systems, high costs, and data-management problems. Collaborative procurement, smarter funding mechanisms, and innovative tools are shaping the future of IT infrastructure and operations.
- Create collective standards: Agencies can improve the IT software market by teaming up to demand higher standards for functionality, data portability, and contract terms in their procurement processes.
- Modernize funding approaches: Establish specialized funds and governance frameworks to ensure continuous investment in updating and maintaining IT systems, avoiding technical debt.
- Standardize data and adopt AI tools: Invest in clean, standardized data and integrate AI-based tools to improve procurement, data analysis, and operational efficiency.
Last week, CIO Journal (https://lnkd.in/d8M3iR53) talked about the concept of cost avoidance and how it has become a "buzzword for holding down headcount." As businesses grapple with economic uncertainty and an ever more complex technology landscape, proactively preventing future expenses has become paramount. Yet quantifying the value of these preventative measures remains a significant challenge: cost avoidance hinges on hypothetical situations, which makes its return on investment hard to measure precisely.

Enter Technology Business Management (TBM), a powerful framework that transforms how CIOs optimize the value they deliver with technology. By analyzing data from various sources, TBM provides a comprehensive view of IT spending across app dev, cloud, security, and more. This increased visibility enables CIOs to pinpoint areas of potential risk and inefficiency, such as underutilized cloud resources or impending software-license compliance issues.

TBM excels at cost avoidance by providing deeper insight than traditional accounting, ITFM, or FinOps. TBM practitioners analyze product and service demand and consumption down to component details such as type, tier, state, capacity, provider, criticality, purpose, and location.

For example, when a global manufacturer's storage team faced a $1M request for additional tier-one storage, TBM analysis revealed that some of its delivery organizations routinely used this expensive tier for non-critical platforms, resilience, and backups. Further analysis showed similar misuse of high-performance cloud storage. The IT organization had been misapplying storage tiers for years, prioritizing performance over business need. By reallocating existing storage and reserving high-performance storage for genuinely critical uses, the company deferred the $1M purchase for nearly five years. (A simplified sketch of this kind of tier analysis follows below.)

Beyond cost savings, TBM fosters a more strategic dialogue between IT and business stakeholders. By aligning IT costs with business outcomes, CIOs can effectively demonstrate the value of their investments, including those focused on cost avoidance. This improved communication strengthens the relationship between IT and the business, leading to better-informed decisions and increased support for critical IT initiatives.

TBM moves cost avoidance from theoretical to practical. By embracing TBM for data-driven, proactive decision-making, CIOs can not only reduce costs but also enhance the strategic value their technology delivers for their organization. If you want to learn more, drop me a note here or contact my team at info@tbmcouncil.org. #techvalue #cio #tbm #tbmcouncil
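To make the storage-tier example concrete, here is a minimal sketch of the kind of consumption analysis described above: flagging premium tier-one storage allocated to non-critical workloads. The inventory file, its column names, and the criticality labels are illustrative assumptions, not a real TBM tool or data model.

```python
# Sketch of a TBM-style consumption check: total up tier-1 storage that is
# backing non-critical workloads, grouped by delivery organization.
import csv
from collections import defaultdict

def flag_tier_misallocation(inventory_csv: str) -> dict:
    """Sum capacity and cost of tier-1 storage backing non-critical workloads."""
    misallocated = defaultdict(lambda: {"capacity_tb": 0.0, "annual_cost": 0.0})
    with open(inventory_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Tier-1 storage assigned to low-criticality, backup, or
            # resilience workloads is a candidate for reallocation.
            if row["tier"] == "1" and row["criticality"] in ("low", "backup", "resilience"):
                org = row["delivery_org"]
                misallocated[org]["capacity_tb"] += float(row["capacity_tb"])
                misallocated[org]["annual_cost"] += float(row["annual_cost"])
    return dict(misallocated)

if __name__ == "__main__":
    for org, totals in flag_tier_misallocation("storage_inventory.csv").items():
        print(f"{org}: {totals['capacity_tb']:.1f} TB of tier-1 storage "
              f"(~${totals['annual_cost']:,.0f}/yr) backing non-critical workloads")
```

The output of a check like this is exactly the evidence described in the post: a per-organization list of reclaimable premium capacity that can be weighed against a new purchase request.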
-
When I was working as the Federal CIO a few years ago, we needed a new and better way to move the federal government from outdated, inefficient IT systems to modern ones. Here's what we did.

The specific problem we were trying to solve was that funding for major IT infrastructure upgrades had a very rocky track record. The annual budget process often produced long-term projects that got started, then stopped, then restarted multiple times (if they finished at all). The result was often very high relative cost, poor functionality, and, in many cases, completely abandoned efforts. A terrible way to run things, to state the obvious.

To solve this, we created the Technology Modernization Fund (TMF). We borrowed a concept the General Services Administration uses to get large, long-term government building projects funded without the political bottlenecks of the annual Congressional authorization and appropriations processes: Congress created an exception for buildings and other major construction projects through a capital funding process that requires only a one-time appropriation and authorization, rather than the government's annual funding approval. Funding IT infrastructure isn't so different from funding buildings or other major projects, so we decided there ought to be a similar mechanism for IT.

How it worked: under the TMF, agencies applied for funding and outlined their proposed IT project, its intended benefits, what it was replacing, and so on, to show how the project would modernize technology. They needed to demonstrate that it would be secure, more cost-effective, and better suited to the agency's mission. Good governance stuff.

The fund has now been running for 8 years. When we first started, it held $20 million; to date, it has received over $1.1 billion and has funded many successful projects that otherwise would likely never have happened. What's important is that there is now a process and a fund focused on keeping applications and infrastructure modern and current. Sadly, however, the most recent estimate is that the TMF is tens of billions of dollars short of the amount needed just to keep pace with the ongoing atrophy that naturally occurs.

A lot of people view IT like a building: once built, it's done. That mindset leads to technical debt and outdated systems. Building owners know there is a bunch of ongoing maintenance needed to keep a building livable and up to current standards; it's no different in IT. If you don't fix aging architecture, the whole structure will eventually collapse. I always focused on constantly asking, 'Are we creating technical debt and security risks?' Because that's what will happen if there's no continuous investment in modernizing IT and keeping systems up to date. This is as true for government agencies as it is for the private sector.
-
The recent #DOGE initiatives mentioned in this article seem to miss the mark: one of the critical challenges in federal procurement is poor data quality, not widespread waste or abuse. The new executive order mandating comprehensive 30-day reviews of existing contracts and acquisition policies appears ambitious but potentially unrealistic given current workforce challenges.

Instead of building entirely new payment-tracking systems, experts recommend improving existing infrastructure such as USASpending.gov and agency-specific invoice-processing systems. Requiring additional written justifications for every payment would create bureaucratic burden without improving efficiency.

What AI-powered procurement transformation truly needs is clean, standardized data. Previous efforts I worked on with ASI Government and PotomacWave have shown that applying artificial intelligence to procurement data can increase accuracy dramatically, but this requires investment in data quality at the source. Rather than developing redundant systems, agencies should prioritize standardizing contract data, reconciling payment information with contract details (extremely important; see the sketch below), and implementing AI tools that can analyze spending patterns across government to secure better rates and identify actual inefficiencies.

The most practical path forward combines streamlining excessive review layers, simplifying contract closeout procedures (especially for smaller contracts), and establishing government-wide data standards that enable meaningful AI analysis. Only then can federal procurement realize the potential of large language models to understand spending patterns and negotiate more favorable terms across agencies. Timothy W. Cooke, Ph.D. Greg Giddens Jason Miller #FederalProcurement #AcquisitionReform #GovTech #DataQuality #ArtificialIntelligence #DOGE #FederalContracting #ProcurementInnovation #GovernmentEfficiency
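As a minimal sketch of the payment-to-contract reconciliation step flagged as "extremely important" above: join payment rows to contract records by contract number and flag payments with no matching contract, or whose running total exceeds the contract ceiling. The file layouts and column names are illustrative assumptions, not actual USASpending.gov or agency schemas.

```python
# Sketch of payment-to-contract reconciliation: find orphan payments and
# contracts whose cumulative payments exceed their ceiling.
import csv

def reconcile(contracts_csv: str, payments_csv: str):
    """Return (orphan_payments, over_ceiling) lists for human review."""
    ceilings = {}
    with open(contracts_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            ceilings[row["contract_number"]] = float(row["ceiling_amount"])

    paid = {}
    orphans = []
    with open(payments_csv, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            num = row["contract_number"]
            if num not in ceilings:
                orphans.append(row)  # payment with no known contract
                continue
            paid[num] = paid.get(num, 0.0) + float(row["amount"])

    over = [(num, total, ceilings[num])
            for num, total in paid.items() if total > ceilings[num]]
    return orphans, over

if __name__ == "__main__":
    orphans, over = reconcile("contracts.csv", "payments.csv")
    print(f"{len(orphans)} payments lack a matching contract")
    for num, total, ceiling in over:
        print(f"{num}: paid ${total:,.0f} against ceiling ${ceiling:,.0f}")
```

Nothing here requires AI; the point is that until this kind of basic join runs cleanly across agencies, spending-pattern analysis by large language models has no reliable data to work from.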
-
The demand for more features, more traffic, and more conversions keeps growing for small business websites. But budgets? They're staying the same (or shrinking). For agencies, this creates a tough challenge: how do you deliver exceptional results without breaking the bank or burning out your team?

In my recent conversation with Itai Sadan, founder of Duda, we tackled this issue head-on. Itai shared how AI is transforming the way agencies operate, making it possible to achieve better outcomes in less time. Here are a few takeaways that stood out to me:

✔️ Efficiency is the name of the game. Agencies using AI tools can build websites twice as fast. For example, with Duda's AI assistants you can generate alt text for 500 images with one click, boosting SEO without the mind-numbing manual work. (A rough sketch of that kind of batch workflow follows below.)

✔️ Content creation made smarter. Need pages optimized for specific keywords? AI tools can generate content drafts that give agencies a solid starting point, saving hours of time.

✔️ Streamlining maintenance. AI can handle essential but time-consuming tasks, like workflow automation and technical SEO, freeing up agency teams to focus on strategy and client relationships.

But here's the kicker: it's not about AI replacing agencies. Instead, agencies that leverage AI will have a distinct edge over those that don't. As Itai put it, “The risk for agencies isn’t being replaced by AI—it’s being replaced by agencies that know how to use it better.”

I'd love to hear your thoughts: how are you incorporating AI into your agency workflows? Are you finding it's making a measurable difference in your results? If you're looking for more ideas on how to future-proof your agency, check out the full episode: https://loom.ly/NI5FNK4 Thanks to our sponsors Duda
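For a rough idea of how batch alt-text generation works under the hood, here is a sketch using an open image-captioning model via Hugging Face transformers. This is not Duda's assistant (its internals aren't described in the post); the model choice and file layout are assumptions.

```python
# Sketch of batch alt-text generation: caption every image in a directory
# with an open image-to-text model and emit filename -> alt text pairs.
from pathlib import Path
from transformers import pipeline

def generate_alt_text(image_dir: str) -> dict:
    """Caption every .jpg in a directory; returns {filename: alt text}."""
    captioner = pipeline("image-to-text", model="Salesforce/blip-image-captioning-base")
    alts = {}
    for path in sorted(Path(image_dir).glob("*.jpg")):
        result = captioner(str(path))  # [{"generated_text": "..."}]
        alts[path.name] = result[0]["generated_text"]
    return alts

if __name__ == "__main__":
    for name, alt in generate_alt_text("site_images").items():
        print(f'{name}: alt="{alt}"')
```

Generated captions should still get a quick human pass before publishing; the time savings come from drafting 500 candidates in one run, not from skipping review.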