Category: CONTRIBUTED CONTENT

  • Achieving unprecedented productivity and innovation with ChatGPT and generative AI

    ChatGPT and generative AI are revolutionizing industries worldwide, with the potential to transform software development and delivery. Although still in its early stages, generative pre-trained transformer (GPT) technology offers significant advantages when combined with causal AI. This pairing amplifies the impact of ChatGPT and other generative AI technologies while making their use safer.

    ChatGPT, an AI chatbot developed by OpenAI, launched in November 2022, captivating boardrooms and households globally. GPT technology and the large language models (LLMs) that power it hold immense potential for tasks such as enhancing customer service and increasing employee productivity.

    Zooming in on the UAE, Dubai’s Road and Transport Authority (RTA) has also announced that it will start using AI in 2031 in a bid to contribute towards the Emirate’s ambition to become the world’s smartest city. The new technology will be used for road maintenance, monitoring public transportation, managing traffic, and much more.

    Additionally, Dubai’s government has set forth laws and regulations for autonomous vehicles with the aim of deploying 4,000 autonomous vehicles by 2030. The city’s goals for smart mobility are anticipated to be advanced by the new law, which is also expected to draw investment and create a regulatory environment that makes it easier to use AI in the mobility and transport industry.

    Similarly, at Dynatrace, we’re exploring numerous ways to leverage GPTs for accelerated innovation and improved team productivity. For instance, at our annual user conference, Perform, in February 2023, we demonstrated how people can use natural language to query our data lakehouse, showcasing the potential of GPT technology to advance “information democracy.” We have only begun to scratch the surface of these opportunities as the technology is still in its early stages.

    Software development and delivery stand as critical areas where GPT technology, such as ChatGPT, excels. It aids DevOps and platform engineering teams in writing code snippets by drawing on information from software libraries. Moreover, it expedites problem resolution in custom code by providing root-cause context to GPTs, enabling auto-generated remediation.

    These examples represent significant improvements over time-consuming manual processes, such as writing repetitive code or scouring countless Stack Overflow pages for answers. GPTs also facilitate rapid onboarding of team members to new development platforms and toolsets by allowing them to ask questions and receive immediate solutions.

    While employing GPTs to accelerate software development, organizations must establish safeguards to protect intellectual property (IP) and data privacy. Site reliability engineers (SREs) and privacy teams should ensure that teams recognize and respect IP rights when sharing code with GPTs. It is equally crucial to prevent inadvertent sharing of IP or confidential data through technologies like ChatGPT, especially when utilizing repositories like GitHub.

    To fully understand the risks associated with GPTs and generative AI, organizations must recognize that LLM-based generative AI is susceptible to error and manipulation. The publicly available information and input these systems draw on can be inaccurate, untrustworthy, or biased. Engineering teams must verify the code they receive from GPTs to mitigate risks to software reliability, performance, compliance, and security.

    Prompt engineering plays a vital role in harnessing the power of GPT tools. Developers provide code and comments as context, which DevOps, SRE, and platform engineering teams feed into generative AI systems. To achieve precise results, prompt engineering requires trustworthy and actionable input. Causal AI, another form of artificial intelligence, plays a crucial role in achieving this precision by drawing insights from technology ecosystems and software lifecycles. Combining causal AI with GPTs empowers teams to automate analytics, explore code impact, and retain full context.
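
    To make this concrete, the sketch below shows one way a team might assemble such a prompt, pairing a code snippet with root-cause context before sending it to whichever GPT service they use. It is a minimal illustration with hypothetical names and data, not a depiction of any specific vendor's API.

    # Minimal sketch (hypothetical names and data): combining code with
    # causal-AI-style root-cause context into a single remediation prompt.
    def build_remediation_prompt(code_snippet: str, root_cause: dict) -> str:
        """Assemble source code and diagnostic context into one prompt string."""
        context_lines = "\n".join(f"- {key}: {value}" for key, value in root_cause.items())
        return (
            "You are assisting a DevOps team with code remediation.\n\n"
            "Root-cause context (from causal AI analysis):\n"
            f"{context_lines}\n\n"
            "Affected code:\n"
            f"{code_snippet}\n\n"
            "Suggest a minimal fix and explain the reasoning."
        )

    prompt = build_remediation_prompt(
        code_snippet="def fetch(url):\n    return session.get(url)  # no timeout set",
        root_cause={
            "symptom": "response-time degradation on /checkout",
            "probable cause": "HTTP calls without timeouts exhausting the worker pool",
            "blast radius": "3 dependent services",
        },
    )
    print(prompt)  # this string would then be sent to the team's chosen GPT endpoint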

    By integrating GPTs into the Dynatrace unified observability and security platform, organizations can enhance the impact and value of ChatGPT and related technologies. Natural language queries combined with causal AI-powered answers provide accurate context, improving the precision and actionability of GPT’s proposals. This integration enables DevOps and platform teams to receive precise recommendations to optimize application response time and automate workflow scaling.

    Generative AI and causal AI complement each other, enhancing the impact of GPT technology. However, organizations must understand that ChatGPT and similar technologies alone do not provide solutions. The quality, depth, and precision of information and context fed into these systems directly affect their proposals.

    To maximize the impact of generative AI, organizations should combine it with causal AI, ensuring they receive reliable and meaningful suggestions. This approach enables deterministic and repeatable automation through causal AI and provides rich context to unlock the full potential of GPTs in software delivery and productivity use cases.

    By addressing security and privacy concerns, organizations can leverage automated prompt engineering to feed GPTs with real-time data and causal AI-powered context. This allows GPTs to drive productivity with suitable and meaningful suggestions. Additionally, combining causal AI and generative AI enables DevOps and platform engineering teams to verify the output of generative AI, ensuring reliability and security in their code. Intelligent automation further facilitates the execution of reliable and secure code.
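
    As a rough illustration of that verification step, the sketch below gates a GPT-suggested function behind automated checks before it is accepted. The snippet and checks are made up for the example; real pipelines would typically rely on full test suites, static analysis, and security scanning.

    # Minimal sketch (illustrative only): run a generated snippet in an isolated
    # namespace and accept it only if all automated checks pass.
    def verify_generated_code(source: str, checks) -> bool:
        """Execute the generated snippet in isolation and evaluate each check."""
        namespace = {}
        try:
            exec(source, namespace)          # kept separate from application globals
            return all(check(namespace) for check in checks)
        except Exception:
            return False

    suggested = "def add_vat(amount, rate=0.05):\n    return round(amount * (1 + rate), 2)"
    checks = [
        lambda ns: ns["add_vat"](100) == 105.0,
        lambda ns: ns["add_vat"](0) == 0.0,
    ]
    print(verify_generated_code(suggested, checks))  # True -> ready for human review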

    This journey towards combined causal AI and generative AI empowers organizations to achieve significant productivity gains and accelerate the speed of software innovation, creating a lasting competitive advantage.

  • Leveraging AI as a catalyst for business evolution

    As the world becomes increasingly digital, businesses are constantly searching for innovative solutions to drive growth, enhance customer experiences, and stay ahead of the competition. Artificial Intelligence (AI) has emerged as a game-changer, offering immense potential to transform industries and drive growth. Its role becomes ever more critical as businesses strive to meet the growing demands of their customers and navigate the complexities of the digital era.

    The potential of AI to revolutionize business processes, generate actionable insights, and deliver personalized experiences is immense. At IBM, we have been at the forefront of AI research and development, and we recently announced the launch of IBM watsonx, our enterprise-ready AI and data platform comprising three components to help businesses accelerate and scale AI: the watsonx.ai studio, the watsonx.data data store built on an open lakehouse architecture, and the upcoming watsonx.governance toolkit.

    Harnessing the power of AI, however, requires a comprehensive understanding of its capabilities and a strategic approach to implementation.

    Tapping into Generative AI:

    In today’s competitive business landscape, organizations must distinguish themselves and provide personalized experiences to their customers. Generative AI and machine learning represent a new era of technological advancement that has the potential to revolutionize how businesses operate. They empower businesses to build customized models suited to their unique requirements, enabling them to stand out in the market and gain a competitive edge.

    Much of the onus also rests on advancing AI skills and expertise. IBM is collaborating with the Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) to launch an AI Center of Excellence that aims to advance collaboration, improve the adoption of AI technology and help drive sustainability. The Center will develop carbon-neutral solutions for existing energy supplies, combat climate change, and further natural language processing (NLP) for Arabic dialects.

    Scaling AI Workloads:

    Scaling AI workloads is a critical challenge that organizations face when adopting AI solutions. The ability to handle large volumes of data, process complex algorithms, and deploy AI models at scale is essential for achieving transformative business outcomes. Watsonx.data addresses this by offering a purpose-built data store built on an open lakehouse architecture. It enables easy data access across cloud and on-premises, empowering non-technical users with self-service access to reliable data. It ensures compliance through centralized governance and policy enforcement, providing a seamless experience for all users involved.

    The importance of data cannot be overstated in the AI era. Organizations must invest in data infrastructure, talent development, and cross-functional collaboration to ensure the successful scaling of AI initiatives. By democratizing data access, fostering a data-driven culture, and empowering employees with AI tools and capabilities, organizations can accelerate the adoption and impact of AI across the enterprise.

    Responsible and Transparent AI Workflows:

    As businesses embark on the journey of AI transformation, it becomes crucial to address the ethical and governance aspects of AI deployment. Governance and ethics play a pivotal role in ensuring the fairness, impartiality, and alignment of AI systems with societal values. Given the potential impact on stakeholders, regulatory compliance, and ethical considerations, it is imperative for businesses to establish well-defined guidelines and policies for AI development and deployment. They must also proactively address concerns regarding privacy, security, and the ethical implications of AI deployment.

    Responsible AI workflows involve designing AI systems that are explainable, accountable, and auditable. Our forthcoming toolkit, watsonx.governance, will enable AI workflows to be built with responsibility, transparency and explainability. AI governance should be a foundational principle, and we urge our customers to embrace governance from the outset of their AI initiatives.

    What our clients have to say:

    “In line with etisalat by e&’s strategy to deploy cutting-edge technologies that will enhance customers’ lives, we look forward to collaborating with IBM once again and leveraging its watsonx.ai technology to deliver a more personalised experience to our customers,” said Khalid Murshed, Chief Technology and Information Officer, etisalat by e&. “By utilising IBM’s generative AI capabilities, etisalat by e& can unlock creative possibilities, simplify business processes, and become a digital services incubator, positioning us for long-term growth and success.”

    In the healthcare sector, for example, organizations globally face challenges such as reducing costs while improving patient care. Billa Bhandari, Founder of VitalProbe Inc., shares: “With IBM’s watsonx.ai, we at VitalProbe Inc. are very excited to provide our customers with predictive models to identify health risks and convert reams of health data into usable information for clinicians to make informed decisions and improve healthcare outcomes.”

    The successful adoption of AI requires a mindset shift, cross-functional collaboration, and continuous learning to navigate the evolving AI landscape and stay ahead of the curve. As we look to the future, we envision expanding the use cases of enterprise foundation models beyond natural language processing (NLP). We are determined to operationalize 100B+ parameter models for bespoke, targeted applications, empowering organizations to create tailored AI solutions that drive real business value. Moreover, we will continue to enhance our AI governance capabilities, ensuring organizations can navigate regulatory landscapes and deploy AI responsibly.

  • Facilities management in action – Best ways to optimise service operations

    According to Fortune Business Insights, the global facility management market is projected to grow from $1,260.36 billion in 2022 to $1,856.44 billion by 2029, at a compound annual growth rate (CAGR) of 5.7% in the forecast period. To take advantage of this growing market demand, facilities management (FM) companies must effectively respond to several trends, challenges and macro-environment changes that have significantly impacted the industry.

    First, in the new age of hybrid working, they are having to deal with a more distributed workforce. This model forces facility managers to be more strategic about managing and maintaining office space, building infrastructure, and investing in technology that supports hybrid collaboration. While many companies are scaling back their headquarters operations, others are increasing the number of satellite offices closer to where people live to encourage in-person collaboration. That’s creating new demands and challenges for FM companies.

    Second, customers are becoming more demanding and are increasingly looking for outcome-based contracts that provide defined end results and guaranteed solutions to problems, rather than traditional ones that simply deliver a product or a service.

    Third, due to an aging workforce and resource shortages, many facilities management teams lack the workers and skills they need. Subcontractors can help to fill in gaps, yet over-dependency can reduce profit margins and result in inefficiencies. It can also hurt customer satisfaction if subcontractors do not deliver the same quality of service as the organisation’s own field engineers, who ultimately serve as brand ambassadors.

    Fourth, there is an ongoing drive from facilities management companies to reduce the environmental emissions and waste generated in the buildings they look after today. According to research by Frost & Sullivan, 40% of CO2 emissions, 75% of electricity consumption and around 35% of waste production comes from our creation and use of buildings. These sobering statistics underline why we can expect, globally, to see a far greater enforcement of sustainability policies. This is not just about legislation. More and more, companies who demonstrate credible Environmental, Social and Governance (ESG) performance are being commercially rewarded – winning tenders and business contracts in preference to competitors who, from a sustainable practice perspective, are lagging behind.

    Finding a solution

    Given all the above, we see a growing list of challenges ramping up for facilities management companies across the sector. The industry is increasingly coming to terms with these challenges and addressing them through a series of interlinked solutions.

    In terms of hybrid working, facilities management companies need to come to terms with the variability in demand for services. With peaks and valleys in workplace utilisation, it’s no longer economical to provide services on a scheduled basis. Instead, FM companies must migrate to a more on-demand service model.

    One way they can help support building efficiency is through predictive maintenance. By integrating IoT, data analytics and AI/ML with a field service management software platform, FM companies can anticipate service issues, avoid unnecessary onsite visits, and deliver a proactive experience. These same technologies drive improvements in sustainability and building efficiency, and, as a result, profitability.
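
    As a simple illustration of the idea, the sketch below flags anomalous sensor readings against a rolling baseline so a technician can be dispatched before a failure. The thresholds, sensor values and function names are illustrative assumptions, not taken from any particular FM platform.

    # Minimal sketch (illustrative values): flag HVAC sensor readings that drift
    # from a rolling baseline, so a proactive work order can be raised.
    from statistics import mean

    def flag_anomalies(readings, window=12, tolerance=0.15):
        """Return indices of readings deviating from the recent rolling mean."""
        anomalies = []
        for i in range(window, len(readings)):
            baseline = mean(readings[i - window:i])
            if baseline and abs(readings[i] - baseline) / baseline > tolerance:
                anomalies.append(i)
        return anomalies

    # Example: compressor current draw (amps) sampled every 5 minutes.
    samples = [4.1, 4.0, 4.2, 4.1, 4.0, 4.1, 4.2, 4.1, 4.0, 4.2, 4.1, 4.0, 5.3, 5.6]
    print(flag_anomalies(samples))  # the last readings exceed the baseline -> schedule a visit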

    With regard to outcome-based contracts, full asset lifecycle support is one approach that offers a positive route forward. Whereas traditional contracts are transactional in nature and focus on service delivered over a limited time frame, outcome-based performance contracts concentrate on value and outcomes delivered over an asset’s lifecycle. For example, maintaining the temperature of an office building at a comfortable 72 degrees whenever the building is occupied.

    To deliver on this promise, FM companies need to be able to configure contracts to account for all the service tasks, resources, and processes that make the outcome possible over the asset’s lifecycle.

    To address skills shortages, FM companies can drive customer self-service. That starts by providing customers with the ability to book, cancel and reschedule appointments themselves. Building on this, FM companies can empower customers to search FAQs and access user guides, or AI-powered virtual agents, to fix issues themselves.

    Finally, in terms of sustainability, we are witnessing facilities management providers looking to cloud-based technology to help them understand whole-life costs and manage and model complex projects and built assets in the most sustainable way.

    We are also seeing greater use of sensors to provide real-time data to ensure heating, ventilation and air conditioning (HVAC) systems are operating effectively and efficiently, minimising the use of energy whilst detecting and addressing any system degradation. Sensing also underpins remote resolution, allowing action to be taken before a fault occurs, often preventing the need for a service visit, and enhancing sustainability further.

    Looking ahead 

    The facilities management industry continues to grow dynamically today, but for those delivering services, shrinking real estate footprints, the drive towards sustainability, and changing customer expectations can all affect the bottom line. Fortunately, we are seeing tools coming online that can address all these challenges and thereby help ensure that companies across the sector can forge ahead positively and achieve the growth that is projected for them.

  • The Metaverse and You

    There’s a picture that pops instantly into our minds when we hear the word ‘Metaverse’: ‘avatars’ of you and your friends in a futuristic setting, almost like new-age versions of the age-old Sims video game that was hugely popular in the early 2000s. And while the metaverse undoubtedly feels like an advanced version of a video game, the possibilities it holds far surpass our imagination. A future where the metaverse is all-pervasive may not happen overnight, but a gradual yet determined collection of digital changes and experiences could certainly take us to the next realm.

    UAE LEADING THE RACE

    Dubai’s pioneering metaverse strategy aims to add US$4 billion to the city’s economy while adding 40,000 virtual jobs by 2030. By 2035, the estimated economic contribution of the metaverse to the annual GDP of the UAE could be between US$8.8 billion and US$16.7 billion.

    A key factor in ensuring the metaverse is adopted is having an environment that supports and enables its quick adoption. The UAE is at the forefront of leading the world into the limitless possibilities this new technology holds. With a sound Web3 strategy, a Minister of AI, and a Digital Transformation Ministry, the UAE hopes to push forward the growth of a digital economy that contributes 20 percent of the gross non-oil national economy.

    While few other countries can boast of the UAE’s vision in this regard, many are investing actively and heavily in the metaverse. In the MENA region, Saudi Arabia, Jordan and Egypt are all investing in AR, VR and AI, creating lucrative investment opportunities in the region. With governments pushing strongly to create digitally sound economies, corporations are now taking the initiative to create systems that will benefit the overall economy.

    DIGITAL TWINS TO GAIN POPULARITY

    If we take a step back in time, a quick glance at tech history shows that the adoption of Web1 and Web2 relied on consumers to grow. What will make Web3 different is that we might see a higher adoption rate from enterprises and businesses than from consumers, which means you may not have to put on VR goggles to shop from your local grocery store, but chances are the backend will be handled by store managers donning those same glasses.

    One such emerging technology that is revolutionizing the management of big warehouses and minimizing response time is that of the digital twin. A digital twin is a virtual replica of a physical object, system, or infrastructure that mirrors its physical counterpart and is dynamic in nature. Companies can use digital twins to manage their inventories and drastically shorten the time it takes to flag an issue or process a request.
    As technology distributors, we are privy to many companies that use digital twins to successfully run and manage their systems.
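
    As a rough illustration of the concept, the sketch below models a digital twin as an in-memory object kept in sync with telemetry from its physical counterpart. All names, fields and thresholds are hypothetical, included only to show the pattern.

    # Minimal sketch (hypothetical names): a digital twin mirroring a warehouse
    # asset, updated from telemetry and queried for maintenance decisions.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class ForkliftTwin:
        asset_id: str
        state: dict = field(default_factory=dict)
        last_updated: Optional[datetime] = None

        def apply_telemetry(self, message: dict) -> None:
            """Merge an incoming telemetry message into the twin's current state."""
            self.state.update(message)
            self.last_updated = datetime.now(timezone.utc)

        def needs_attention(self) -> bool:
            """Simple rule: flag the asset if battery is low or a fault code is set."""
            return self.state.get("battery_pct", 100) < 20 or bool(self.state.get("fault_code"))

    twin = ForkliftTwin(asset_id="FLT-042")
    twin.apply_telemetry({"battery_pct": 14, "location": "aisle-7"})
    print(twin.needs_attention())  # True -> raise a charging or maintenance task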

    METAVERSE IN THE PUBLIC EYE

    On the marketing side too, we are witnessing many big players, like Nike and Gucci, quickly jumping onto the metaverse bandwagon with the promise of an enhanced shopping experience. With a Gartner report suggesting that 25% of the population will spend at least one hour a day in the metaverse for activities ranging from shopping to working to social media and entertainment, it opens a whole new avenue for brands to connect with their audiences. Several companies already use VR to help you experience their products in the comfort of your home, like furniture retailers who help you find the perfect spot for a couch, or luxury watchmakers who use AR to let you try on different kinds of watches and pick what would look best on your wrist.

    Many technology companies too are warming up to the idea of actively incorporating AR and VR into their offerings. We at Redington will be taking our partners to the metaverse this month during Redington Re-Imagine 2023, where we are bringing together the best minds in technology to discuss everything from quantum physics to future materials. By engaging with the technology community and generating interesting conversations about the future, we can come together to create and reimagine a better tomorrow, fuelled by new and emerging technologies.

    And while challenges will exist, especially in terms of privacy, the metaverse can offer endless possibilities, particularly for enterprises. As the UAE races ahead in the fourth industrial revolution, its progress will be built on the strength of AI, 5G, robotics and digital twin technology, ensuring fast-paced growth in the digital and physical worlds.

  • Tackling the MENA region’s data skills shortage requires an intelligent approach

    Organisations across every industry need to access and work with data and analytics to stay competitive, yet the demand for data skills far outstrips supply in the MENA region. A recent survey by PwC Middle East found that 46% of respondents in the UAE and 58% in Saudi Arabia reported a skills shortage, covering everything from basic Microsoft skills to cloud computing. In addition, respondents to a McKinsey global survey on future workforce needs say that solving the data analytics skills shortage is a priority focus area.

    For many business leaders, analytics can be the difference between making informed real-time decisions for the business and the risk of being surpassed by competitors who have prioritised digital skills such as data literacy. The unique selling proposition of the world’s most profitable companies is their ability to collect, analyse and act on their stored data at scale, and use it to make better, more effective decisions.

    Any organisation wishing to emulate these analytically mature businesses and achieve success using its own data is held back not only by a shortage of expert data scientists but also by the misconception that hiring more data scientists is the only way to deliver these insights. They forget that the most significant data-driven insights in history were not delivered by data scientists, but by subject matter experts with hands-on experience of the problems being solved.

    So, how can businesses in MENA change their approach to hiring and managing talent – reducing their focus on simply recruiting data professionals and homing in on upskilling the existing workforce to deliver data insights?

    Step 1: Discover and enable your in-department experts

    Global research conducted by Gartner predicted that by 2023, data literacy would become an explicit driver of business value, demonstrated by its formal inclusion in over 80% of data and analytics strategies. This means that success is linked to how well employees can “speak data”. Knowledge workers are not only the missing link between data and data scientists but also know where the gold is buried. Digitally enabled knowledge workers bring nearly immediate business impact. But data is everywhere. For these new data strategies to deliver across all departments, they must involve developing these in-department experts through cross-functional learning resources and engagement, ensuring they thrive and bring the unique knowledge and context they possess to the forefront of the decision-making process.

    Part of achieving this goal and delivering the data skills that businesses need today is assessing what resources are contained within each department and how they can be put to best use.

    Companies wishing to overcome the data skill shortage must therefore focus on empowering their existing workers. Finding and generating data-driven insights cannot be the remit of any single team, but a mission approached by everyone in the business.

    Step 2: Create a new strategy incorporating these hidden experts

    Most data scientists do not have the time or capacity to solve every problem they are given. A more efficient use of data experts is to have them provide leadership on data strategies, while in-department experts – knowledge workers who understand the business context and possess diverse perspectives – provide them with support and input on strategic decisions. According to Gartner research, more than one billion knowledge workers were in-role globally in 2019, but they remain critically underused.

    Part of the problem is that most data analytics teams are heavily (or exclusively) made up of data scientists. This gap often leads to a narrow understanding of any given problem or solution. Organisations need to bring their in-department knowledge workers into the analytic journey. They can do so by developing a culture that enables and facilitates the communication of data. Developing this new culture necessitates establishing a sense of awareness and enthusiasm among people, providing the requisite training and upskilling, and ensuring the successful implementation of the new culture.

    Step 3: Develop a continuous cycle of learning and improvement

    While companies often look outwards for technical solutions, the key component to successfully developing a broader data-literate workforce comes from within. Tackling the MENA region’s data skills shortage is a journey. To build a more data-literate workforce, companies must prioritise encouraging all data workers to improve their skills, rather than only providing specialised technologies for data scientists. This effort should be coupled with enjoyable activities like Datathons, which challenge teams to solve problems and learn collaboratively. Creating a safe environment for failure is also crucial; using sample datasets and virtual environments to test ideas can help employees navigate challenges.

    By prioritising this, companies can address their data skills shortages and leverage existing experts to deliver insights faster, benefiting the organisation as a whole. The Smart Dubai Data Initiative has set a leading example in the UAE by establishing advanced governance frameworks to manage data as a strategic asset and fostering collaboration between the government and private sectors. Any company aiming to tackle their data skills shortage must focus on leveraging such initiatives, upskilling and enabling the experts they already employ to support current data science teams.

  • Right solutions, strategy and insights

    Over the past year, IBM has reimagined how it engages with partners. For example, in October 2022, IBM revamped its approach to skilling and, for the first time, gave all registered partners access to the same training and enablement as IBM sellers at no cost to them. In January, a new partner program, IBM Partner Plus, was launched to help partners deepen their technical expertise, accelerate time to market and win with clients.

    IBM’s approach has improved how it supports and co-creates with partners by providing unprecedented access to IBM resources and incentives, helping them unlock greater opportunities in MEA and worldwide. For example, following the introduction of IBM watsonx – a new AI and data platform – at the Think conference in Orlando, IBM launched training and skilling programs for partners to ensure they are knowledgeable and prepared to implement the new technology with their clients when it becomes available later this year.

    IBM Partner Plus program designed to fuel growth 

    IBM is indeed putting partners in control of their earning potential. They can start earning on day 1 of joining as an IBM partner, making it easier than ever to scale their growth.

    As part of IBM Partner Plus, current partners can access a simple incentives stack – across software and infrastructure – that provides a clear line of sight into how their business decisions impact the incentives they receive, as well as how to maximize those incentives and deal eligibility. The IBM Partner Portal offers additional visibility into how partners are scaling through the program tiers and provides a live dashboard that shows how much they are earning.

    Technology enabling new business models and revenue streams

    Hybrid cloud and AI are two of the most transformative technologies of our time, and IBM wants its partners to be the best in the industry at enabling clients to capitalize on their potential. The combination of IBM technology, such as an open hybrid cloud platform with Red Hat OpenShift and IBM Cloud Paks, with the specialization and expertise of partners is a powerful way to achieve differentiation in the market.

    IBM watsonx is a recent example. It’s a new AI and data platform that will enable enterprises to scale and accelerate the impact of the most advanced AI with trusted data. The platform consists of three components to help enterprises unlock business value derived from how adaptable an AI model can be to their unique data and domain knowledge. IBM has heard a great deal of excitement about watsonx from partners who are, unsurprisingly, reporting unprecedented interest in AI from their clients.

    Broader market opportunities

    IBM Partner Plus offers a single, integrated experience, consistent for all partner types. Whether partners are ISVs, MSPs, system integrators, resellers, GSIs, or others, IBM is putting growth potential into their hands. Each partner can deepen their technical expertise to receive benefits, resources, and incentives that can help them win with clients.

    In the new program, the more you learn, the more you earn, and benefits relevant to each partner are unlocked at each tier.

    Driving partner collaboration in the hybrid cloud era

    This success depends on harnessing the reach and capability of partners by making them a true extension of IBM. A thriving ecosystem is about leveraging joint strengths to solve today’s most complex challenges, and collective clients are already experiencing the benefits of this approach. The reach and scale of the IBM Ecosystem are helping bring client solutions to market 4x faster and driving exponential growth.

    Through collaboration, IBM is enabling partners to tap into new opportunities by helping them meet clients wherever they may be on their hybrid cloud journeys. Last fall, IBM announced a program in the US that enables ecosystem partners to resell eligible IBM software solutions through the AWS Marketplace. It’s a good example of how IBM is helping foster new opportunities for partners, create new avenues of business, and connect parts of the ecosystem that might not have worked together in the past.

    Identifying potential partners

    The IBM Ecosystem includes tens of thousands of partners with a shared vision for the future of hybrid cloud and AI. Partners are committed to deepening their technical expertise and understanding clients’ needs, allowing them to leverage joint strengths to deliver value. IBM works with partners in whatever way they choose to engage, whether they build on, service or sell IBM’s hybrid cloud and AI technology, or do a combination of these activities.

    IBM Partner Plus makes it easier than ever for new partners to join and transact, and for existing partners to expand their offerings. IBM has added 2,000 new registered partners globally since January, and the goal is to continue to scale and grow the ecosystem with new, skilled and innovative partners.

  • Treasury Transformation – the trendsetting force

    The treasury operations of a company are crucial, particularly with regard to cash liquidity, the most critical aspect. CFOs understand that their treasury teams must manage critical tasks such as accurate forecasting of cash flows, operational efficiency, fraud prevention, compliance, optimizing working capital, and FX risk management to maintain their organisation’s health. The performance of the treasury team directly impacts the bottom line.

    The core areas of the Treasury function can be broadly categorized into the following:

    • Cash Management
    • Liquidity Management
    • Risk Management
    • Capital Markets
    • Corporate Finance

    Challenges in Manual Processes:

    According to a survey conducted by Kyriba – a leading treasury management solution provider – and later published in Global Treasurer, global treasury teams waste an average of 4,812 hours per year using traditional spreadsheets to manage their cash, payments and accounting operations. This leads to:

    • Lack of visibility of cash positions
    • Redundancy of processes
    • High banking volumes
    • Lack of speed and quality in cash forecasting
    • Challenges in settling payments in multiple currencies
    • Difficulty coping with regulatory changes
    • Lack of centralized control and efficiency

    Treasury Transformation

    Automating corporate treasury is becoming increasingly important as businesses look to streamline their operations and reduce costs. Automation can help improve efficiency, reduce risk, and increase visibility into financial operations.

    The Wall Street Journal published a seemingly innocuous article entitled, “Stop Using Excel, Finance Chiefs Tell Staffs,” in which CFOs lamented the time their teams waste on spreadsheets, including pulling data from disparate systems, instead of engaging in more strategic initiatives.

    Kyriba TMS: Transforming Treasury Management

    Treasury management has come a long way since its inception. Gone are the days when finance professionals relied on spreadsheets and manual processes to manage cash and associated risks. Today, modern treasury management systems (TMS) like Kyriba TMS have become instrumental in enabling businesses to efficiently and effectively manage their cash, liquidity, and risks. Kyriba TMS is a leading cloud-based TMS platform that is used by more than 2,000 global organizations. Let us take a closer look at this innovative solution and how it is transforming treasury management.

    What is Kyriba TMS?

    Kyriba TMS is an all-in-one, cloud-based platform that helps organizations optimize their cash and liquidity management, manage financial risk, and automate their financial operations. It provides a suite of modules that support treasury management, payments, risk management, supply chain finance, and working capital optimization. Kyriba TMS integrates with a wide range of systems, including banks, ERPs, and trading platforms.

    Features and Benefits of Kyriba TMS:

    1. Cash and Liquidity Management

    Kyriba TMS provides real-time visibility into an organization’s cash positions across all bank accounts and currencies. The platform’s robust cash forecasting capabilities help businesses make more informed decisions about funding, investment, and liquidity management. Additionally, Kyriba TMS enables organizations to automate their cash management processes, including payments and bank account reconciliation.

    2. Risk Management

    Kyriba TMS provides robust risk management capabilities to help businesses mitigate financial risk. Its treasury risk management module enables organizations to monitor and manage market, liquidity, and credit risk in real time. It also provides tools for creating and managing financial hedging strategies.

    3. Payments

    Kyriba TMS streamlines the payments process by providing a single platform to initiate, approve, and track payments. The platform supports a wide range of payment types, including bank transfers, cheques, ACH, and wire transfers. Businesses can also take advantage of the platform’s built-in fraud detection and prevention tools.

    4. Supply Chain Finance

    Kyriba TMS provides supply chain finance capabilities that allow businesses to extend payment terms to suppliers while improving their working capital. The platform enables organizations to provide early payment options to their suppliers in exchange for discounts.

    5. Working Capital Optimization

    Kyriba TMS provides a range of tools to help businesses optimize their working capital. The platform’s cash visibility and forecasting capabilities enable organizations to identify opportunities to improve cash flow. It also provides tools for managing accounts receivable and accounts payable.

    6. Enhanced Security

    Kyriba TMS offers a range of enhanced security features to protect sensitive financial data and prevent unauthorized access. The platform employs best-in-class security protocols and technologies to ensure maximum protection for its users. Some of the key security features of Kyriba TMS include:

    • SOC 2 Type 2 compliant
    • Redundant disaster recovery
    • Encryption, authentication and administration
    • Audit trails

    7. Kyriba Connectivity Service

    Kyriba’s Connectivity as a Service gives customers a complete connectivity solution encompassing ERPs, internal financial systems, third-party providers, and over 1,000 ‘out-of-the-box’, pre-configured, pre-tested connections with financial institutions across the globe. Kyriba’s solution includes application programming interfaces (APIs) delivering real-time integration, payments and reporting.

    Why Choose Kyriba TMS?

    For superior decision-making support across a wide range of corporate activities and risk management techniques, Kyriba offers unmatched cash visibility. Treasury management solutions from Kyriba have a wealth of features and functionality to aid in efficient treasury/cash administration. It is the best method for maximising profitability while managing your working capital. By skilful management of cash, liquidity, payments, and financial risk, the entire company’s liquidity can be optimised.

    The 5 WHYs to choose Kyriba:

    1. Integrated solution focused on Treasury and Finance.
    2. 100% SaaS – Multi-tenant architecture.
    3. Connectivity – Integrated connectivity managed 100% by Kyriba
    4. Global Solution – Product interfaces in 14 languages, international support centres
    5. Security – Unique secure platform, digital signatures, encryption, multi-factor authentication, SSO, SOC 2 Type 2, ISO 27001

    Finesse – Kyriba Partnership

    Finesse is a renowned implementation partner for Kyriba, with a team of highly certified consultants who have completed over 40 successful implementations across more than 10 countries worldwide. Leveraging its profound treasury expertise and industry know-how, Finesse adopts a meticulous approach to driving treasury transformation and pairs it with Kyriba’s innovative cloud-based treasury and financial management platform. Together, Finesse and Kyriba support clients in optimizing their treasury processes and enhancing the precision of their cash positions and forecasts.

    As treasury automation helps treasury teams harness more data and employ more analytics, those teams will become more strategic and capable of meeting their CFOs’ expectations. Companies, in turn, will enjoy improvements in free cash flow, cash usage for working capital, productivity and efficiency, investing and borrowing performance, and reduced risk exposure.

    Written by Manoj Panicker, AVP, Treasury Transformation

  • How cloud-native functions can help telcos meet sustainability goals 

    A convergence of geopolitical tensions and climate change concerns is ramping up the pressure for operators to become more efficient in everything they do and wherever they do it—from network cores to data centres and beyond.

    Across the world, thoughts of service consolidation and alternative, modern deployment models are clearly gaining traction throughout the industry as decision-makers search for ways to improve efficiency with minimal disruption and incremental cost.

    Indeed, in Heavy Reading’s 2023 5G global survey of network strategists, respondents’ top approach to reducing power consumption focused on moving as many functions as possible to a common infrastructure platform (52%). This was followed by reducing infrastructure footprints and increasing power efficiency with edge computing (45%), and consolidating functions and vendors for tighter energy management and cost efficiency (40%).

    Commenting on the findings, Gabriel Brown, a principal analyst at Heavy Reading, notes that “disaggregation and multi-vendor by nature introduce some inefficiencies, so thinking about how to address this is clearly important.” He goes on to add that consolidating core workloads onto a common cloud platform appears to be “the most consequential move operators can make to reduce energy consumption in the 5G mobile core.” However, the overall spread in responses indicates that operators “will combine multiple approaches in their power reduction strategy.”

    Cloud-Native to the Rescue? 

    In another recent study, Capgemini Research Institute’s report – Networks on Cloud: A Clear Advantage – claims that almost half of telecom networks’ capacity will be totally cloud-native in the next three to five years.

    The report also indicates that operators will spend $206 million annually on that cloud transformation over the next five years. Organisations getting in early on a shift to cloud-native are likely to realize the most value in terms of economics and environmental sustainability. On the latter point, the research suggests that those embracing telco cloud are expected to reduce greenhouse gas emissions by 5% in the next three to five years. Telco cloud is also set to yield sustainability benefits from lower facility emissions (e.g., reduced physical hardware footprints, less power usage, auto-scaling of the network on demand, and managing mobile towers’ power consumption using AI and machine learning).

    This is why Cloud-Native Functions (CNF) will increasingly come into play. Or at least they should!

    CNFs are software implementations of a function, or application, traditionally performed on a physical device.

    Purpose-built for moving workloads to cloud-native architectures, the technology can eliminate telcos’ “heavy” legacy virtualization software layers, as well as automate and orchestrate operations for maximum efficiency. All while scaling their networks.

    It also means they can combine multiple functions on a single platform, boosting performance while using less server capacity, fewer CPU cycles and, therefore, less energy. For example, those running 5G networks can migrate existing network functions to cloud-native versions of the same.

    It is important to note that, with consolidated CNFs, a single command (or API call) can activate multiple network functions, such as domain name server (DNS), gateway-Internet firewall (GiFW) and carrier-grade network address translation (CGNAT). This unlocks a whole new world of flexibility, including scaling workloads up and down ‘on the fly’ whenever needed. An operator could, for example, use CNFs to quickly provide compute and network capacity for a major sports event and then take it down at the end of the tournament. A traditional approach of running functions on dedicated hardware can take several months to set up.
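
    A rough sketch of what such a declarative activation request could look like is shown below. The endpoint, payload schema and field names are purely hypothetical, intended only to illustrate the idea of one API call bringing up several consolidated functions and scaling them back down afterwards.

    # Minimal sketch (hypothetical endpoint and schema): one declarative request
    # asking a CNF orchestrator to activate DNS, GiFW and CGNAT for an event site.
    import json
    from urllib import request

    payload = {
        "site": "stadium-edge-01",
        "functions": ["dns", "gi-firewall", "cgnat"],       # consolidated CNFs on one platform
        "scale": {"min_replicas": 2, "max_replicas": 20},    # elastic during the tournament
        "ttl_hours": 72,                                     # torn down automatically afterwards
    }

    req = request.Request(
        url="https://orchestrator.example.net/v1/cnf/activations",  # hypothetical API
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    print(json.dumps(payload, indent=2))  # request.urlopen(req) would submit the activation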

    By contrast, a consolidated CNF platform can be quickly spun up and down as required and its power consumption is kept to a minimum. The principle is the same as when your TV or smartphone switches to energy saving mode when not in use.

  • Creating an all-flash datacentre: the final nail in the coffin for legacy hard disk

    Simplicity and innovation are the foundations on which John ‘Coz’ Colgrove started Pure Storage in 2009. He wanted to keep these principles while achieving a vision to take flash storage mainstream and create an all-flash datacentre.

    Innovative vision

    Today the advantages of flash over legacy disk-based storage are well known. However, back then, the main barriers to implementing it were the purchase cost, the cost of operating it, and working out how to integrate it into the data lifecycle.

    Fast forward a few years and flash has become the obvious choice for any performance- or latency-sensitive workload. Since flash memory was introduced to enterprise storage, it has been taking on and replacing hard disk tiers one by one. Market conditions mean that the move to an all-flash data centre has now changed from vision to reality.

    There are many reasons why the time for all-flash is today, from dealing with data growth and the cross-over between the cost of flash and disk, to the clear sustainability savings to be made. These are some reasons why flash will wipe out disks for good:

    Growth in unstructured data

    Running out of capacity or having no additional resources is a nightmare for every IT manager. Getting value and insight from every bit of their data — now that’s nirvana. Unstructured data is expected to grow by 10x before 2030, continuing its existing exponential growth. Organisations are going to have to consider how they make use of it all and flash changes the dynamic for big data analytics, data lakes, content repositories, and even backup & recovery.

    Cost of NAND

    The price-per-bit for NAND flash is declining at a rate much faster than nearline hard drives. While today the cost-per-bit of a hard drive is still lower than the densest flash, there are several key trends to pay attention to in 2023. First, all the major tier 1 flash manufacturers are demonstrating significant density increases this year; with over 200 layers of stacked 3D NAND in some cases. This increase in density will translate into better cost efficiency, as well as further improvements in datacentre power and space savings. Second, if analyst predictions hold, NAND prices overall are expected to decline through most of 2023, continuing a trend that began in late 2022.

    Flash as an environmental choice

    Datacentre infrastructure represents over 1% of global energy consumption, and this share is growing. Organisations are examining their carbon emissions much more closely and many need to look at ways to reduce their carbon footprint. Flash has a vital contribution to this reduction in energy use.

    • Unstructured data is complex and power-hungry, while disk is inefficient and space-hungry. Put them together and you have mounting issues for data centres, bottom lines, and the environment.
    • Power utilisation may not have been a key driver behind moving data sets before, but with the rising unpredictability of energy availability, plus an increased focus on reducing emissions, this becomes an investment in the future that organisations can’t afford to ignore.
    • As a result of rising energy costs, energy consumption by information technology has become, not just a concern from an environmental standpoint, but now also a significant economic one.
    • Energy efficiency has become a leading factor in enabling flash technology to achieve a lower TCO than disk in the future.

    Higher Performance

    For some organisations, cost will trump performance in decision-making criteria. Yet in several key ways, the performance advantage of flash over disk can translate into cost benefits:

    • First, because disk performance is low, even in workloads with modest performance requirements, organisations often end up with stranded capacity. A spinning drive may hold 20TB, but if the performance of the system taps out at an effective 16TB, the organisation does not realise the full cost-per-bit it paid for (see the worked example after this list). Flash doesn’t have this issue because performance stays predictable even as utilisation increases, unlike hard drives.
    • Second, any storage environment needs to deliver on resiliency and will have a redundancy strategy to ensure that data loss is not a risk. With flash devices, faster rebuild times mean fewer bits need to be dedicated to resiliency structures, which equates to better effective cost efficiency.
    • Third, for backup and then recovery following a ransomware incident, the disk can take an inordinate amount of time to restore data. With flash-based systems, faster restore performance means getting your business back online sooner than with legacy disk-based architectures.
    • Fourth, going hand in hand with better resiliency is better reliability. Compared to mechanical disk drives, flash is simply more reliable; devices fail less often leading to fewer replacements and therefore lower costs, less time replacing failed components, lower risk and no skyrocketing maintenance costs at years 4, 5, and beyond.
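
    The worked example below makes the stranded-capacity point concrete. The drive price and capacities are illustrative assumptions, not vendor figures.

    # Worked example (illustrative numbers): stranded capacity raises the
    # effective cost per usable terabyte of a disk-based system.
    def cost_per_usable_tb(price_per_drive: float, usable_tb: float) -> float:
        """Cost per terabyte that can actually be used at the required performance."""
        return price_per_drive / usable_tb

    nominal = cost_per_usable_tb(price_per_drive=400, usable_tb=20)  # full 20TB usable
    capped = cost_per_usable_tb(price_per_drive=400, usable_tb=16)   # performance caps usage at 16TB
    print(f"nominal: ${nominal:.2f}/TB, with stranded capacity: ${capped:.2f}/TB")
    # nominal: $20.00/TB, with stranded capacity: $25.00/TB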

    The path to all-flash 

    Many organisations are already well on the journey to all-flash; they have deployed flash-based storage platforms for their performance-sensitive workloads where speed delivers business value; these use cases include databases, private cloud deployments and, increasingly, modern analytics environments. Those organisations that have deployed modern all-flash platforms are reaping the benefits of performance, server efficiency, massively simplified operations and also energy efficiency and associated greenhouse gas reductions.

    For organisations who want to start a migration to an all-flash data centre — here are some key considerations. There needs to be a robust strategy that supports business goals and will determine elements including:

    • Analysis of the business needs and how all-flash will support these.
    • Consideration of total cost of ownership and expected return on investment.
    • Consideration of the wider business landscape including regulations, data protection and location.
    • The time needed to scope and test a solution.
    • The time needed for data migration including consideration of the end of life for existing legacy equipment.

    Additionally, the ability to consume all-flash for petabyte-scale unstructured workloads through a subscription model allows users to grow as business needs demand. A flexible, cloud-like Storage-as-a-Service (STaaS) model should include stringent Service Level Agreements on everything from administrative support and day-to-day management to energy consumption. This brings even more operational efficiency, further widening the gap to legacy platforms. Migrations can be complex, but with the right vendor and partner support they can be painless. A proof of concept will allow organisations to see the technology in action, validate the benefits and determine the right solution.

    A vision realised: flash is the choice for environmental, workload and cost reasons

    The simple truth is that no one today continues to buy spinning drives because they’re better, only because they’re cheaper. In almost every way, flash memory is superior to spinning magnetic rust, and while flash is already better, its advantages keep increasing. Flash has become a viable option for everyday workloads, offering enterprises a better way to manage data growth efficiently, reliably and sustainably.

    Where flash lagged behind disk was cost, but the disk/flash crossover point is now upon us. This, combined with the power, space, and cooling savings, higher performance, and better reliability of flash, will soon make hard drives the less cost-effective option. The dream of an all-flash data centre is now a reality for forward-looking modern organisations.

  • World Password Day

    The history of passwords dates back to the Roman Empire. Initially, they were called passcodes, carved into wood, and soldiers passed them around via the active guard to validate soldier and guard movement. They were a shared resource, and multiple people could be aware of the current “secret”. Today, the most common storage medium for a password is the human brain. We assign a password to a system or application, recall it when it needs to be used, and remember it each time we change it. Our brains are full of passwords and, often, we forget them, need to share them, and are forced to document them using insecure methods like paper or spreadsheets. These insecure methods for sharing passwords have led to front-page news stories about data breaches and compelled organizations to educate employees on the risks of insecure password storage and sharing. A better method of documenting passwords is needed: one that is highly secure, documents distributed access, and promotes sharing and collaboration with minimal risk, no matter where the access occurs.

    Morey Haber, Chief Security Officer at BeyondTrust.

    In reality, people should not be expected to remember every password they need, nor is it safe to reuse passwords across multiple services and applications. This has spawned an entire market for personal password managers, privileged access management for businesses, and passwordless technology like Windows Hello and Apple Touch ID.

    Since Covid, we truly have a work-from-anywhere world, and the cloud is ideal for situations when passwords need to be available outside of the organization and across multiple geographical locations, or when on-premises technology is incapable of meeting business objectives and minimizing risk, or is cost-prohibitive.

    This World Password Day, consider the risks of remembering, sharing, documenting, and reusing passwords. Security best practices today offer better methods, including password services in the cloud, to minimize the need to remember passwords.

    Joseph Carson, Chief Security Scientist, Delinea.

    World Password Day serves as a reminder to reflect and think about your password health. If you are anything like me, you are not a fan of passwords – having to frequently change them and choose the next great password that is better, longer, and more unique than the previous one.

    This World Password Day, let us take a moment and think about how we can remove passwords from our lives and into the background while making our digital lives safer. A great place to start is by using a Password Manager.

    A Password Manager will let you know when your password needs to be changed, when it is weak, or when it is reused. Even better, when used in conjunction with multi-factor authentication (MFA), it takes away the tedious task of choosing – and remembering – your next great password.
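
    For a sense of what that automation looks like under the hood, the short sketch below flags reused or too-short passwords in a toy vault. It is a simplified illustration with made-up entries, not how any particular password manager is implemented.

    # Minimal sketch (made-up data): the kind of hygiene check a password manager
    # automates, flagging passwords that are too short or reused across accounts.
    from collections import Counter

    vault = {
        "email": "Winter2023!",
        "bank": "Winter2023!",   # reused across two accounts
        "forum": "pass123",      # short and easily guessed
    }

    counts = Counter(vault.values())
    for account, password in vault.items():
        issues = []
        if len(password) < 12:
            issues.append("too short")
        if counts[password] > 1:
            issues.append("reused")
        if issues:
            print(f"{account}: {', '.join(issues)}")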

    Let us use this World Password Day to move passwords out of our lives, into the background, and make our digital world a safer place.