Building a data strategy for Generative AI

By Manav Gupta, CTO and VP, IBM Canada

January 10, 2024

Artificial Intelligence is making headlines daily. It’s dominating business conversations and shooting to the top of the agenda for many corporations, if not already there. According to IBM’s Institute for Business Value, sixty per cent of C-suites will be piloting generative AI solutions in their organizations by next year.

Earlier this month I sat down with Bell’s Director of IT Delivery, Neel Mehta, to discuss the integral connection between solid data foundations and an effective AI strategy. Neel gave a great analogy of this relationship – “We see data as the new oil, and AI as the refinery.”

Bell’s heritage is about connecting people to one another and to their devices. Today, the company uses a combination of generative and predictive AI to serve its customers even better: for example, predicting extreme weather and environmental events that may cause outages, and preparing its workforce for them.

Neel shared an observation I’m sure many other enterprises will relate to: AI is galloping ahead at an accelerated rate, outpacing the rate at which clean data can be fed in. That’s where data strategy comes in, along with a shift in focus from capturing data to connecting data.

The deployment of any new technology always comes hand in hand with implementation challenges. During our discussion we touched on what those have been for Bell, which I expect are shared with other organizations navigating their own AI journeys.

  1. Data silos – Like many enterprises, Bell has traditionally been organized vertically, a structure that doesn’t lend itself to connecting data. The solution? Building horizontal governance and data platforms so data can be shared between product lines and business units.
  2. Data latency – AI is great for producing insights, but when the work is done through batch processes, those insights often arrive too late to act on. Event-driven architecture speeds this up: combining network events with customer journey events in real time enables Bell to target the right customer with the right device at the right time.
  3. Data quality – Data governance is important, and the information feeding AI must be reliable. How can this be achieved? By cleansing the data and building a framework that keeps it clean as new data is curated.
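To make the event-driven idea in point 2 concrete, here is a minimal sketch in Python of correlating a network-event stream with a customer-journey stream as events arrive, rather than in a nightly batch. The `Event` fields, stream names, and payloads are all hypothetical illustrations, not Bell’s actual schema or architecture.

```python
from dataclasses import dataclass

@dataclass
class Event:
    customer_id: str
    stream: str   # "network" or "journey" -- assumed stream names
    detail: str

def correlate(events):
    """Pair each customer's network events with their journey events
    the moment both sides have arrived, instead of waiting for a
    scheduled batch job to join them."""
    network, journey, matches = {}, {}, []
    for e in events:
        if e.stream == "network":
            network[e.customer_id] = e
            if e.customer_id in journey:
                matches.append((e, journey[e.customer_id]))
        else:
            journey[e.customer_id] = e
            if e.customer_id in network:
                matches.append((network[e.customer_id], e))
    return matches
```

In a production system the two dictionaries would typically be replaced by a stream processor joining topics keyed by customer, but the principle is the same: the match is produced as soon as the second event lands, not hours later.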

Data quality is intrinsically linked to one of the hottest topics in AI conversation today: trust. It’s important to maintain a laser focus on keeping AI usage fair, unbiased and compliant. Customer transparency is just as important: customers should know what you’re doing with their data and how long it’s being kept.

We wrapped up our conversation on why it’s important not to get carried away with the AI hype. It must solve real customer pain points, and outputs should always be linked to specific business goals. For organizations that are early in their AI journeys, or thinking of setting out, Neel shared this great advice:

  • Be curious about data and AI and keep experimenting.
  • Be clear how the work you’re doing maps to your company’s broader strategy, and always consider the bigger picture.
  • Learn about data security; it’s incredibly important.

To view this LinkedIn Live session, and learn more about building a data strategy for generative AI, click here.
