What CIOs and CTOs should consider before adopting generative AI for application modernization – IBM Blog

Implementing generative AI can seem like a chicken-and-egg conundrum. In a recent IBM Institute for Business Value survey, 64% of CEOs said they needed to modernize apps before they could use generative AI. At the same time, generative AI has the power to transform the process of application modernization through code reverse engineering, code generation, code conversion from one language to another, modernization workflow definition and other automated processes. Here’s how CTOs and CIOs can evaluate their technology and data estates, assess the opportunity and chart a path forward.

CIOs and CTOs need to:

  • Evaluate their organization’s level of hybrid cloud mastery as a bedrock strategy for effective implementation of generative AI
  • Assess the organizational obstacles and costs of implementation and of maintaining the status quo
  • Weigh the costs and benefits of using general-purpose large models versus tuning smaller ones
  • Assess factors and costs related to data availability, governance, security and sustainability
  • Work with HR to put people at the center of your generative AI strategy

Hybrid cloud accelerates generative AI adoption

For the last decade, IBM has championed a hybrid cloud strategy to underpin scalable AI-driven innovation, productivity and efficiency. From our perspective, the debate over architecture is over. Organizations that have mastered hybrid cloud are well positioned to implement generative AI across the organization. Hybrid cloud allows them to take advantage of powerful open-source large language models (LLMs), use public data and computing resources to train their own models, and securely fine-tune their models while keeping their proprietary insights private. Along with adding enormous value to customer and employee experiences and to HR and customer service functions, generative AI on hybrid cloud gives CIOs and CTOs exceptional agility to automate IT operations and modernize applications, potentially eliminating their technical debt and enabling truly continuous modernization.

The business context

Even for CIOs and CTOs who have committed to hybrid cloud, organizational obstacles to modernization remain. First, technology leaders need to estimate the full financial impact of modernization (versus the cost of not modernizing) across the organization. They need to champion modernization as a business initiative, not an IT project. Leaders must also address the expertise gap by prioritizing talent development and get cultural buy-in on modernization as a strategic, future-proofing business investment rather than an operational technology play.

Next, leaders need to understand the business value generative AI can bring to modernization to understand where they should invest. In the experience of our IBM Consulting teams, organizations that are just getting started on their modernization journeys need perspective on the “art of the possible” when it comes to understanding the benefits and value of AI-driven automation. Organizations that are more advanced on their journeys are looking for clarity around use cases in their industry and assistance to handle unique opportunities.

Prioritizing generative AI use cases

Within IT operations, generative AI use cases include automatic triaging of systems to adhere to service-level objectives; managing, communicating about and resolving queries and tickets; and event and anomaly detection and management. It can improve IT automation by building and executing runbooks and by helping users transition to new knowledge bases and software. It can also aid in platform engineering, for example by generating DevOps pipelines and middleware automation scripts.
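
To make the ticket-triage use case concrete, here is a minimal sketch of how a generative model could be asked for structured triage output. The prompt wording, the call_llm placeholder and the severity labels are illustrative assumptions, not part of any specific IBM offering.

```python
# Minimal sketch of LLM-assisted incident triage. call_llm is a placeholder
# for whichever generative AI endpoint the organization has approved; the
# prompt and severity scheme are illustrative assumptions.
import json

TRIAGE_PROMPT = """You are an IT operations assistant.
Classify the incident below and suggest first-response runbook steps.
Reply in JSON with keys: severity (P1-P4), category, suggested_steps.

Incident: {description}
"""

def call_llm(prompt: str) -> str:
    """Placeholder: route the prompt to your model-serving endpoint."""
    raise NotImplementedError

def triage_ticket(description: str) -> dict:
    raw = call_llm(TRIAGE_PROMPT.format(description=description))
    ticket = json.loads(raw)                # expect structured output back
    if ticket.get("severity") == "P1":
        ticket["route_to"] = "on-call SRE"  # escalate critical incidents
    return ticket
```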

Much more can be said about IT operations as a foundation of modernization. Here, we’ll prioritize discussion of four workflows to which generative AI can be applied.

  • Transformation planning: Generative AI can help define your modernization workflow through summarization, plan creation and the generation of reference architecture artifacts such as Terraform templates.
  • Code reverse engineering: Generative AI facilitates reverse engineering by analyzing code to extract business rules and domain models, generating recommendations for moving applications from a monolithic architecture to microservices, identifying refactoring and containerization opportunities, and generating refactored code.
  • Code generation: Code generation helps IT leaders overcome challenges related to developer bandwidth and optimizing the skills of a limited talent pool. Highly repetitive and manual tasks can be handled by cloud-native code generation, from short snippets to full functions. Code can be generated for UI design, infrastructure, container platform configuration (such as Red Hat® OpenShift®) and serverless frameworks (such as Knative).
  • Code conversion: Code conversion is essential for retaining and updating mission-critical legacy applications. Generative AI enables automation of this process, for example from COBOL to Java, SOAP to REST and other languages and environments (see the sketch after this list).
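
The sketch below illustrates the code-conversion workflow as a plain pipeline step: legacy source goes in, a candidate translation comes out for human review. The llm callable, prompt text and output naming are assumptions for illustration; they do not describe the internals of any particular conversion tool.

```python
# Sketch of an LLM-driven COBOL-to-Java conversion step. The llm callable,
# prompt text and output naming are illustrative assumptions; generated code
# is written out for review and testing, never merged automatically.
from pathlib import Path
from typing import Callable

CONVERT_PROMPT = (
    "Convert the following COBOL paragraph to an equivalent Java method. "
    "Preserve the business logic and add comments where behavior is ambiguous.\n\n{cobol}"
)

def convert_cobol_file(cobol_path: str, llm: Callable[[str], str]) -> Path:
    cobol_source = Path(cobol_path).read_text()
    java_candidate = llm(CONVERT_PROMPT.format(cobol=cobol_source))
    out_path = Path(cobol_path).with_suffix(".java")  # a candidate, pending review
    out_path.write_text(java_candidate)
    return out_path
```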

CTOs and CIOs should consider the quick wins of using generative AI within these functions. Look for relatively discrete and low-risk opportunities to explore proof-of-concept implementations. Start small, test and scale.

Evaluating foundation models

Selecting the right foundation models up front can help you deliver more accurate and efficient outcomes for your enterprise.

The architecture of transformers favors size: larger models produce better results. So, there’s a race in generative AI to build ever-bigger foundation models for ever-broader applications. But while the largest models are powerful, a heavy multibillion-parameter model may not always be the best option for an enterprise. A smaller model that has been fine-tuned for a task can often outperform a large model that hasn’t been fine-tuned for that task. Such models can be built on top of general-purpose LLMs with modest tuning, provided the underlying foundation is fit for enterprise use. For example, IBM’s 13-billion parameter Granite foundation models, available in the upcoming release of watsonx.ai, are much smaller than the largest LLMs (which contain hundreds of billions of parameters), but perform well on business-specific tasks such as summarization, question-answering and classification while being much more efficient.
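
As a generic illustration of the “smaller, tuned model” point (not the Granite or watsonx.ai workflow), the sketch below uses parameter-efficient LoRA tuning to adapt a small open model to a classification task; the checkpoint name, label count, target modules and hyperparameters are placeholders.

```python
# Illustrative parameter-efficient fine-tuning of a small model for a
# business-specific classification task, using Hugging Face transformers and
# peft (LoRA). Checkpoint, label count and hyperparameters are placeholders.
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base = "distilbert-base-uncased"   # small base model chosen for illustration
tokenizer = AutoTokenizer.from_pretrained(base)   # used to prepare labeled data
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=3)

lora = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=8, lora_alpha=16, lora_dropout=0.1,
    target_modules=["q_lin", "v_lin"],   # attention projections in DistilBERT
)
model = get_peft_model(model, lora)
model.print_trainable_parameters()   # only a small fraction of weights are trained
# Training then proceeds with the usual Trainer or a custom loop on labeled data.
```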

Fit-for-purpose foundation models also enable organizations to automate and accelerate modernization by generating code snippets and application components, along with automating application testing. Drawing on the code models built into watsonx.ai, IBM watsonx Code Assistant can also be used to convert code, for example from COBOL to Java. Within watsonx Code Assistant, developers of all experience levels can phrase requests in plain language and get AI-generated recommendations, or generate code based on existing source code. watsonx.ai also includes access to the StarCoder LLM, trained on openly licensed data from GitHub. Developers can leverage StarCoder to accelerate code generation and increase productivity for application modernization and IT modernization.
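
For code generation with StarCoder specifically, a minimal sketch using the open Hugging Face transformers API might look like the following; it assumes access to the gated bigcode/starcoder checkpoint and enough accelerator memory to load it, and the prompt is purely illustrative.

```python
# Minimal code-completion sketch with the StarCoder model via Hugging Face
# transformers. Assumes the gated "bigcode/starcoder" checkpoint is accessible;
# in practice the ~15B-parameter model is usually sharded or quantized.
from transformers import pipeline

generator = pipeline("text-generation", model="bigcode/starcoder")

prompt = 'def parse_invoice_date(raw: str):\n    """Convert dd/mm/yyyy strings to ISO-8601."""\n'
result = generator(prompt, max_new_tokens=64, do_sample=False)
print(result[0]["generated_text"])
```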

Beyond size, when choosing a foundation model, CTOs should also consider the natural languages and programming languages the model supports and the amount of fine-tuning the model needs.

Creating a customized ROI framework

In generative AI, ROI calculation methods are not yet mature or standardized, and comparative benchmarks are rarely available. For enterprise applications, fine-tuning, prompt engineering and running compute-intensive workloads all require significant investment.

There are four key cost factors to consider when selecting and deploying a model, and they vary by domain, industry and use case. The first is the pricing or licensing method, which is driven by API usage on public and managed clouds and by hosting and compute costs on hybrid and private clouds. The second is development effort, which is higher on hybrid and private clouds and maps closely to the third factor, enterprise data security. Lastly, consider the potential impact of IP and security risk, both of which decrease toward the hybrid and private ends of the spectrum.
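
To make the first cost factor tangible, the toy comparison below contrasts consumption-priced API usage with self-hosted compute; every number is an illustrative placeholder, not a vendor quote.

```python
# Back-of-the-envelope comparison of the first cost factor: consumption-priced
# API usage (public/managed cloud) versus hosting and compute (hybrid/private).
# All figures are illustrative placeholders, not vendor quotes.

def api_cost(tokens_per_month: int, price_per_1k_tokens: float) -> float:
    """Monthly spend under consumption (per-token) pricing."""
    return tokens_per_month / 1000 * price_per_1k_tokens

def hosted_cost(gpu_hours: float, gpu_hourly_rate: float, ops_overhead: float) -> float:
    """Monthly spend when hosting the model on dedicated infrastructure."""
    return gpu_hours * gpu_hourly_rate + ops_overhead

if __name__ == "__main__":
    print(f"API usage:   ${api_cost(50_000_000, 0.002):>10,.0f} per month")
    print(f"Self-hosted: ${hosted_cost(720, 3.00, 4_000):>10,.0f} per month")
```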

Data availability and governance factors are also considerations when assessing ROI. Through the watsonx platform, IBM is making significant strides in delivering foundation models that are targeted to the needs of business users: the fit-for-purpose data store provided in watsonx.data, built on an open lakehouse architecture, allows enterprises to personalize their models wherever their workloads reside. The tools in watsonx.governance will also help organizations efficiently drive responsible, transparent and explainable workflows across the business.

As the capabilities and uses of generative AI accelerate, putting numbers to the benefits side of the ROI equation can be a challenge. But it makes sense for CIOs and CTOs to examine the many ways organizations have created business value from traditional AI as a starting point, and to extrapolate potential value from their generative AI test cases and quick wins.

Consider sustainability goals

Whether as part of formal ESG programs or corporate missions, sustainability is more than good ethics—it’s increasingly recognized as better business. Companies with committed, effective sustainability efforts can boost business value with improved shareholder return, revenue growth and profitability. Thus, it’s wise for CTOs to factor sustainability into their generative AI adoption calculus.

Training, tuning and running AI models can leave an enormous carbon footprint. That’s why IBM helps tailor generative AI for the enterprise with foundation models that are trustworthy, portable and energy efficient. Making smaller models and using computing resources more efficiently can greatly reduce expense and carbon emissions. IBM Research is also developing more efficient model training technologies, such as the LiGo algorithm that recycles small models and builds them into larger ones, saving up to 70% of the time, cost and carbon output.
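
For intuition on the footprint side, a rough estimate multiplies energy use by grid carbon intensity. The formula is standard, but every number in the sketch below is an illustrative placeholder rather than a measured figure.

```python
# Rough training-emissions estimate: energy consumed times grid carbon
# intensity. All input values are illustrative placeholders, not measured
# figures for any particular model or data center.

def training_emissions_kg(gpu_hours: float, gpu_power_kw: float,
                          pue: float, kg_co2e_per_kwh: float) -> float:
    energy_kwh = gpu_hours * gpu_power_kw * pue   # facility-level energy use
    return energy_kwh * kg_co2e_per_kwh

# Example: 10,000 GPU-hours at 0.4 kW per GPU, PUE of 1.2, 0.4 kg CO2e/kWh grid.
print(f"{training_emissions_kg(10_000, 0.4, 1.2, 0.4):,.0f} kg CO2e")
```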

Lead with human resources

Lastly, effectively implementing generative AI depends on skilled and enthusiastic people. Thus, human resources departments should be at the center of your organization’s strategy. Begin by reskilling the HR professionals themselves, who are likely already using AI-driven hiring tools. Next, develop a formal management initiative to communicate where generative AI testing and adoption are underway and to provide feedback.
