So strong is the interest in artificial intelligence and its potential to change how we do things that Gartner predicts by 2025, 39% of organisations worldwide will be at the experimentation stage of Gartner's AI adoption curve, and 14% will be at the expansion stage.
Boomi’s chief technology officer for Asia-Pacific and Japan, David Irecki, says AI and advanced technologies streamline workflows and improve decision-making. He adds that predictive analytics helps forecast needs and optimise resource allocation.
“By integrating various systems and automating processes, government services and businesses can deliver more efficient, responsive, and data-driven services, significantly enhancing operational efficiency and citizen satisfaction,” continues Irecki.
Better ways to manage data to drive business insights
According to Irecki, organisations' digital footprints are also increasingly fragmented. He observes that many organisations are still grappling with digital fragmentation and the complexity that comes with it.
“Intelligent integration and automation play a huge role in driving business insights, and setting an organisation on the path to success would mean consolidating data from disparate sources,” he comments. “Integrating with business intelligence (BI) tools facilitates data visualisation and actionable insights.
“Effective big data management involves structuring data pipelines for seamless data flow, ensuring data quality, and applying advanced analytics to uncover trends and make informed decisions, driving business growth and innovation.”
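The consolidation Irecki describes can be sketched in a few lines. This is an illustrative example only, not Boomi's approach: the record fields ("id", "email", "plan") and the two source systems are invented for demonstration, and the quality gate is a deliberately minimal stand-in for real data-quality tooling.

```python
# Sketch: consolidating records from two hypothetical systems into one
# deduplicated dataset, with a minimal data-quality check.

def consolidate(*sources):
    """Merge records from multiple systems, keyed on "id";
    later sources enrich or overwrite earlier ones field-by-field."""
    merged = {}
    for source in sources:
        for record in source:
            merged.setdefault(record["id"], {}).update(record)
    return list(merged.values())

def quality_issues(records, required=("id", "email")):
    """Flag record IDs missing required fields -- a minimal quality gate."""
    return [r["id"] for r in records
            if any(not r.get(field) for field in required)]

crm = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": ""}]
billing = [{"id": 1, "plan": "pro"}, {"id": 3, "email": "c@example.com"}]

records = consolidate(crm, billing)
print(len(records))             # 3 unique customers across both systems
print(quality_issues(records))  # [2] -- record 2 has no email
```

A real pipeline would feed the consolidated output into a BI tool for visualisation, as Irecki suggests; the point here is only the shape of the merge-then-validate step.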
David Irecki
Options for increased performance and reliability
Irecki stresses the importance of implementing a multi-cloud strategy to ensure scalable, flexible integration across cloud platforms, enhancing both performance and reliability.
“A robust, real-time, scalable infrastructure plays a significant role. With seamless process automation across clouds, organisations can capitalise on the strengths of different cloud providers while maintaining high availability and disaster recovery capabilities,” he adds.
Other options besides the cloud
Picking up on the rush to use artificial intelligence – sanctioned or otherwise – McGallen & Bolden Group CTO Dr Seamus Phan says that while commercial AI, such as OpenAI's ChatGPT, is crowding the media, AI is not just OpenAI's ChatGPT or Anthropic's Claude.
“There are interesting offline large language models (LLMs) that do not tap cloud-based AI services and, as such, provide users with data privacy while leveraging the benefits of AI and LLMs,” he opines. “For example, GPT4All and LM Studio are two such publicly available options.”
Phan posits that businesses can select LLMs that offer commercial-use licences and that run adequately on their Macs or PCs.
“With such LLMs, businesses, especially bootstrapped ones, can harness the benefits of AI without paying for increasing usage costs as they scale their usage and businesses.”
Dr Seamus Phan
He suggests AI LLMs can help line managers in finance, marketing, manufacturing, and logistics better grapple with possibilities, while always leaving the final call to human users to discern and decide. “Such LLMs can also work with local disk documents, analysing them and summarising them for quick access and use,” Phan continues.
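One way to work with a local model along the lines Phan describes is through LM Studio's built-in local server, which exposes an OpenAI-compatible HTTP endpoint (by default on localhost:1234) so document text never leaves the machine. The sketch below is a hedged illustration under that assumption; the model name is a placeholder for whichever model is loaded locally, and `summarise_locally` only works while the local server is actually running.

```python
import json
import urllib.request

# Assumption: LM Studio's local server is running at the default address.
LOCAL_URL = "http://localhost:1234/v1/chat/completions"

def build_summary_request(document_text, model="local-model"):
    """Build an OpenAI-style chat payload asking for a brief summary.
    The model name is a placeholder for the locally loaded model."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "Summarise the document briefly."},
            {"role": "user", "content": document_text},
        ],
    }

def summarise_locally(document_text):
    """POST to the local server. Requires LM Studio (or any
    OpenAI-compatible local server) to be running with a model loaded."""
    payload = json.dumps(build_summary_request(document_text)).encode()
    req = urllib.request.Request(
        LOCAL_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

Because the endpoint mimics the OpenAI API shape, the same payload-building code can later be pointed at a different backend without rewrites, which suits the bootstrapped businesses Phan has in mind.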
Integrating old and new tech
Millions of “legacy” lines of code run today’s business environments. According to a London Research survey of 200 insurance executives in Europe, legacy technology is cited by 45% of respondents as the single biggest barrier to the adoption of digital tools and new ways of working among insurance companies.
But replacing legacy systems is not as simple as flicking a switch. The tight-knit hardware, software and processes make modernisation a challenge, and that’s oversimplifying it!
Irecki says the first step should be to assess the existing systems, adhere to industry standards, and adopt open technologies for broader compatibility, such as connectivity through APIs and database queries.
He adds that the next step of the integration process is to build, deploy, and manage: regularly update and validate integrations to maintain smooth operations, ensuring that digital platforms remain flexible and adaptive to evolving technological landscapes.
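The "regularly validate integrations" step Irecki mentions can be sketched as a simple health-probe loop. This is illustrative only: the endpoint names are invented, and the probe function is injectable so a real check (an API ping, a database query) can be swapped in for the stub used here.

```python
# Sketch: run a health probe against every integration endpoint and
# collect the ones that fail, so problems surface before they disrupt
# operations. Endpoint names and the stub probe are hypothetical.

def validate_integrations(endpoints, probe):
    """Probe each named endpoint; return the names that failed."""
    failures = []
    for name, target in endpoints.items():
        try:
            if not probe(target):
                failures.append(name)
        except Exception:
            # A probe that raises (timeout, refused connection) also
            # counts as a failure for that integration.
            failures.append(name)
    return failures

endpoints = {
    "crm-api": "https://crm.example/health",
    "legacy-db": "db://erp/ping",
}
# Stub probe for demonstration: treats only HTTPS targets as healthy.
stub_probe = lambda target: target.startswith("https://")
print(validate_integrations(endpoints, stub_probe))  # ['legacy-db']
```

In practice this loop would run on a schedule and feed alerting, but the pattern of probe-and-report is the core of keeping integrations validated.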
Dr Phan suggests using common denominators of data and adopting the "Elon Musk 5-step protocol for successful engineering" – (1) make requirements less dumb, (2) try to delete part of the process, (3) simplify or optimise, (4) accelerate cycle time, and (5) automate, in this exact order – for all manner of digitalisation efforts, which are essentially also engineering projects.
He goes on to suggest that where there are well-published open-source resources, use them. “Reduce your requirements to what is really required to run your business, rather than imagining you need a lot more, which can impede your implementation and introduce even more possibilities of error.
“The more variables you introduce, the more problems you may face. For legacy systems, it is time to retire them. If not now, when? Look for functional replacements to quickly migrate your data over and get on with running your business,” he continues.
Caution should temper exuberance
Experts, however, caution that AI cannot be embraced as a panacea to solve all the world's problems – natural and man-made. It has as much potential to be used for harm as for good, as occurred in Hong Kong when a finance employee transferred HK$200 million following a deepfake video meeting.
This is prompting CIOs, or chief data officers where the role exists, to reiterate the importance of data governance. It is not just about making sure the data is clean; it is also about keeping the data secure throughout its lifecycle.
Gartner defines data governance as the specification of decision rights and an accountability framework to ensure the appropriate behaviour in the valuation, creation, consumption and control of data and analytics.
Irecki says effective data governance requires ensuring data quality, consistency, and security across the enterprise. Implementing data lineage tracking, regular monitoring, and adhering to regulations like GDPR and ISO standards are critical steps.
He adds that employing robust data encryption and access control mechanisms can mitigate risks and protect sensitive information, fostering trust and reliability in data management.
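Two of the controls Irecki names, access control and protection of sensitive records, can be sketched with Python's standard library. This is a minimal illustration under stated assumptions: the roles, the record shape, and the hard-coded key are invented, and a real deployment would use a key-management service plus full encryption at rest rather than the tamper-evidence shown here.

```python
import hashlib
import hmac

# Illustrative only: demo key and roles, not a production design.
SECRET_KEY = b"demo-key-not-for-production"
READ_ROLES = {"analyst", "admin"}

def sign_record(record: str) -> str:
    """Attach an HMAC-SHA256 signature so later tampering is detectable."""
    return hmac.new(SECRET_KEY, record.encode(), hashlib.sha256).hexdigest()

def read_record(role: str, record: str, signature: str) -> str:
    """Enforce role-based access, then verify integrity before returning."""
    if role not in READ_ROLES:
        raise PermissionError(f"role {role!r} may not read this record")
    if not hmac.compare_digest(sign_record(record), signature):
        raise ValueError("record integrity check failed")
    return record

sig = sign_record("salary=90000")
print(read_record("analyst", "salary=90000", sig))  # salary=90000
```

The `compare_digest` call matters: it compares signatures in constant time, avoiding a timing side channel that a naive `==` comparison would introduce.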
Demonstrating ROI amidst unpredictable conditions
Asked how organisations can and should optimise IT spending and demonstrate ROI in a challenging economic environment, Irecki says an effective SaaS management tool enables organisations to monitor and assess application usage, minimising reliance on legacy systems.
“By gaining granular visibility into usage data and understanding associated costs, organisations can implement subscription models based on usage, aligning expenses closely with actual use,” he suggests.
Irecki says this demonstrates ROI through performance metrics, improved operational efficiency, and faster time-to-market. He recommends regularly tracking and reporting cost savings and productivity gains, highlighting how integration and automation investments lead to significant returns and financial sustainability.
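The usage-based view Irecki describes amounts to comparing paid seats against active users per subscription. The back-of-envelope sketch below illustrates that idea; all figures, app names, and the 50% utilisation threshold are invented for demonstration.

```python
# Sketch: flag SaaS subscriptions where active usage falls well below
# paid capacity, and estimate the spend tied up in idle seats.

def underused(subscriptions, threshold=0.5):
    """Return {app: wasted_spend} for apps whose seat utilisation
    falls below `threshold`. Figures are illustrative."""
    flagged = {}
    for app, info in subscriptions.items():
        utilisation = info["active_users"] / info["paid_seats"]
        if utilisation < threshold:
            idle = info["paid_seats"] - info["active_users"]
            flagged[app] = idle * info["cost_per_seat"]
    return flagged

subs = {
    "crm":       {"paid_seats": 100, "active_users": 90, "cost_per_seat": 50},
    "legacy-bi": {"paid_seats": 40,  "active_users": 8,  "cost_per_seat": 120},
}
print(underused(subs))  # {'legacy-bi': 3840} -- 32 idle seats x $120
```

Reported over time, numbers like these are exactly the cost-saving evidence Irecki suggests tracking.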
Dr Phan says the recent Microsoft-CrowdStrike event is a reminder of the importance of data, software, and system resilience. “Businesses may consider bringing some of the cloud-based applications and data back "home" to run them locally and on local networks with open source software and stored for local network collaboration, using the cloud services for data redundancy purposes rather than the centre of data management,” suggests Phan.
“With the likes of LibreOffice, user licence costs will reduce dramatically, which immediately provides ROI in terms of zero upfront and usage costs. The caveat is to have IT and line managers send reminders to users to update their software when updates become available. It is a small hassle, but worth the long-term savings,” he opines.
Ensure resilience in the face of global uncertainties
Dr Phan acknowledges that in a global situation of sanctions and competition, it has become clear that data must stay home, and preferably software and hardware too. He ponders what would happen if a business were suddenly crippled because its hardware is disabled remotely through cloud-based services, or its cloud-based data becomes unavailable due to data centre breakdowns, or worse. He revisits the idea of thinking "data at home first" rather than "cloud-only" or "cloud-first".
For his part, Irecki suggests organisations develop robust disaster recovery plans and regularly back up data to ensure continuity. “Implementing redundancy and failover mechanisms ensures uninterrupted operations. Regularly review and update business continuity plans to address emerging threats and ensure resilience against various disruptions, maintaining seamless service delivery and operational stability,” he concludes.
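The failover mechanism Irecki recommends follows a simple pattern: try the primary service, fall back to a secondary, and fail loudly only if both are down. The sketch below illustrates the shape of it; the service callables are stand-ins for real regional endpoints.

```python
# Sketch: primary/secondary failover. In a real deployment the two
# callables would wrap requests to separate regions or providers.

def with_failover(primary, secondary):
    """Call primary(); on any failure, fall back to secondary().
    If both fail, the secondary's exception propagates."""
    try:
        return primary()
    except Exception:
        return secondary()

def primary_store():
    # Simulated outage in the primary region.
    raise ConnectionError("primary region down")

def backup_store():
    return "served from backup region"

print(with_failover(primary_store, backup_store))  # served from backup region
```

Paired with the regular backups Irecki mentions, this is the difference between an outage users notice and one they never see.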