Understanding the Evolution of AI
Substantive AI efforts got their start in the 1960s and 1970s but were hampered by a lack of computing power. Gains in storage and processing capacity in the late 1980s and 1990s enabled AI tools capable of beating chess grandmasters, interpreting speech and recognizing emotions.
Cloud computing kicked AI expectations into high gear, but initial efforts were more hype than help. According to a 2021 study, for example, 66 percent of AI-driven chatbot interactions received a rating of 1 out of 5 stars from customers.
Soon after, industry-specific applications of AI shifted toward improving productivity. According to Lenovo, AI tools are used in manufacturing to reduce downtime by up to 45 percent, in retail to increase profitability by up to 59 percent and in finance to drive down default rates by up to 20 percent.
Widespread adoption, however, has set the stage for a new phase known as hybrid AI. Under this model, AI is brought closer to the data, whether that data lives in the cloud or on-premises. Vendors such as Lenovo support these deployments with “a full line of servers with NVIDIA GPUs,” according to The Futurum Group.