Q&A with Acuity’s Data Expert

Data Strategy

An Interview with Krishnan Srinivasan, Technical Director and Data Strategist


Krishnan Srinivasan is Acuity’s Technical Director and Data Strategist. He has developed secure data architectures and solutions for several federal agencies. Prior to his arrival at Acuity, he held technical and leadership positions at T. Rowe Price, Informatica, J.P. Morgan, BMW, and Citicorp.

The Federal Data Strategy is facilitating changes within agencies to foster greater collaboration.

What would you say are some of the most important challenges facing Federal CDOs today?

I think it really comes down to a change in culture. This is true for both Federal agencies and large enterprises because they face very similar problems. Every large enterprise is made of sub-groups with responsibility for specific functions. They buy good tools and technology to support that function, but as the organization grows, its tasks and the roles of its employees become more specialized. This generally leads to new rounds of purchases with even more specialized tools and technologies that are not necessarily designed for integration.

The data that people store follows the same pattern; it is generally collected for a specific individual function with little thought to integrating or sharing the data with the rest of the enterprise. The larger enterprise then ends up with lots of applications and lots of data stored in “data silos” all over.

To really make the best use of data, you need a culture where data is seen as an asset to be shared. Once people agree on the importance of sharing the data, you can design a modern architecture that supports secure sharing of data and ensures that the right tools are put into place with a governance plan that allows for integration with future technology.

A true data culture means many things – from having a mindset of sharing data with other organizations within the enterprise, to understanding that good quality data is everyone’s responsibility, to adjusting business processes to make use of data and AI, and most importantly, embracing the use of data in decision-making at all levels.

How are things changing today?

Up until recently, very few people were thinking about the potential value of their data to other groups in the organization or to the organization as a whole. Now leaders, especially CDOs, have begun viewing data as a valuable asset. They want to make the best use of it, which requires sharing. When you have multiple groups who are not used to sharing data, and they’re using multiple systems that don’t integrate, the situation can become very complex.

Fortunately, organizations are more open to sharing data today. And today’s technology enables better collaboration and better, more secure sharing of data assets in an organization.

It seems as if the volume of data collected keeps spiraling upward. How does that volume affect the technology needed for sharing?

Small and large sets of data have very different technology needs. When you deal with massive data sets, such as we’re seeing today, you suddenly need tools and technology that can handle that volume. You need the right tool for the right job, and it needs to be future-ready. There is no sign of a slowdown in data collection! But as with any tool, there are trade-offs. Think about a Swiss Army knife – it can do many things, but you don’t choose it when you have a better tool for the job. So when CDOs and CIOs decide on their data architecture strategy, they have to be mindful of the trade-offs and have a deep understanding of what functionality they need to optimize. You have to make a trade-off between speed and processing power when you deal with massive data sets, for instance. So you need to understand the size of your data sets and the needs of your users. It requires balance.

There are also trade-offs for using open-source vs. vendor software. Open-source prevents vendor lock-in, but users can’t always get the functionality they need, when they need it. With open-source, there is no single point of contact for support – it is a community of developers, so you don’t have the guarantee that the feature you need will be added. With a vendor, you can usually have greater influence over the functionality. So there are upsides and downsides to each.

How can organizations make their data more usable?

As large organizations like federal agencies publish more data and open it up for sharing, they need to understand the needs of the various users. They also need to understand what data they currently have and how good it is, and then publish this data internally to everyone in the organization. They need to make sure the architecture enables seamless access to all of the data sets – with appropriate controls, of course. Finally, they should be careful not to use more data just because they have access to it. They need to ensure that the data used adds value to the business process.

What is the good news?

Federal agencies are making data usable. They are recognizing data as a very important asset and changing the way they do business. We now have CDOs in agencies who are leading this transformation.

Another very positive change is the addition of socialization tools to enable collaboration across teams. Modern platforms allow users to enhance data and to clarify its origin and meaning. Socialization tools allow those who were once passive consumers of data to provide feedback, and the platforms can capture and use that feedback to improve the data. This creates an ecosystem that builds confidence in the data, and that is essential. These tools were not available three to five years ago.

What about the role of machine learning and artificial intelligence? How are they changing the way federal agencies do business?

Everybody is trying to see how AI and ML will fit in. I see them as enablers that will make everyone’s jobs easier and more interesting. Take the probability of fraud, for example. Using AI, you can score behavior that seems likely to be fraudulent. This allows investigators to spend their time on the cases that show a high probability of fraud. It saves time and effort – not to mention frustration.

Of course, you need good data to train the model. That’s what it will come down to. People who commit fraud try to game the system, so you need to continue to evolve your models.
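The fraud-scoring idea described above can be sketched as a simple logistic scorer that ranks cases so investigators see the highest-risk ones first. This is an illustration only: the feature names, weights, and threshold below are hypothetical, and a real system would learn its weights from labeled historical cases and retrain as fraud patterns evolve.

```python
import math

# Hypothetical weights a trained model might produce; in practice these
# would be fitted on labeled historical cases and periodically retrained.
WEIGHTS = {"amount_zscore": 1.4, "new_payee": 0.9, "odd_hours": 0.7}
BIAS = -2.0

def fraud_score(features):
    """Return a probability-of-fraud score in [0, 1] for one case."""
    z = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))  # logistic (sigmoid) link

def triage(cases, threshold=0.8):
    """Route only high-probability cases to human investigators."""
    return [c for c in cases if fraud_score(c["features"]) >= threshold]

cases = [
    {"id": "A1", "features": {"amount_zscore": 3.0, "new_payee": 1, "odd_hours": 1}},
    {"id": "B2", "features": {"amount_zscore": 0.2, "new_payee": 0, "odd_hours": 0}},
]
flagged = triage(cases)  # only the high-scoring case is escalated
```

The point of the sketch is the workflow, not the model: scoring lets investigators spend their limited time on the small fraction of cases above the threshold rather than reviewing everything.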

Any advice to CDOs and CIOs?

Remember you can’t change everything overnight. It takes a lot of effort. Any change has an impact on current users, and the more users, the more impact the change will have. This is why it is critical to solicit feedback from a wide range of users.

Good change management means not stopping what you’re doing. While everyone is looking for long-term solutions, technology will continue to change, and we have to be ready to change with it.