What is AI?
Artificial intelligence is the umbrella term for the tools and technologies that make software smart. It is the science of enabling computers to learn, read, write, create, and analyze. In essence, it allows machines to perform tasks that have traditionally required human intelligence.
AI has sweeping implications for all facets of society, and higher education is no exception. Below are a few basics about AI to get started.
Machine learning is the primary subset of AI. It allows computers to learn from vast data sets and interactions without a human having to write explicit code. It’s the fundamental technology behind such predictive tasks as Netflix suggesting what you should watch or Amazon recommending what you should buy.
Deep learning is a type of machine learning, and it's where most of the innovation has happened in the past decade. Artificial neural networks, loosely modeled on how the human brain processes information, give deep-learning systems the ability to recognize images, understand speech, and, importantly, improve their performance as they process more data.
Large language models are a type of AI trained on vast amounts of text data to understand and generate human-like language. LLMs can write, summarize, translate, chat, answer questions, play games, and generate software code. OpenAI's ChatGPT (GPT stands for "generative pretrained transformer"), Google's Gemini (formerly known as Bard), and Microsoft's Copilot are examples of products built on LLMs.
Generative AI is a class of AI that creates new content on the basis of what it has learned from existing content. Its outputs can include images, text, video, audio, code, and other media.
Prompting is the technique of instructing a generative AI product to create content. It's like giving directions and then refining the responses through conversation. There is an art and a science to prompting: learning to write effective prompts is a skill that helps you get useful results faster.
AI is also "augmented" intelligence. Large language models, such as ChatGPT and Gemini, are aids that augment but do not replace human abilities. They don't function like traditional software. They don't always do what they're told, they sometimes do the unexpected, and they have inherent flaws. A baseline understanding of how LLMs work helps you judge when, and how far, to trust their outputs.
The art of prompting matters (for now). When ChatGPT first emerged, many people assumed that prompt engineering would become a career path. In practice, it has proven to be a skill set: your ability to write effective prompts dictates how much value you can get out of the system. Experts say the best way to interact with an AI tool is to talk to it like an intern in training. Tell it what you want, give examples, describe your expectations for the outcome, and if the first answer doesn't suffice, instruct it to try again.
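The advice above (state the task, give examples, describe the expected outcome) can be sketched as a short script that assembles a structured prompt before sending it to any chat-based LLM. This is a minimal illustration only; the `build_prompt` helper and its fields are hypothetical, not part of any particular AI product.

```python
# A minimal sketch of structured prompting: state the task, give
# examples, and describe expectations. The build_prompt helper is
# a hypothetical convenience, not a real AI product's API.

def build_prompt(task, examples, expectations):
    """Assemble a structured prompt string for a chat-based LLM."""
    lines = [f"Task: {task}", ""]
    if examples:
        lines.append("Examples of what I want:")
        for example in examples:
            lines.append(f"- {example}")
        lines.append("")
    lines.append(f"Expectations: {expectations}")
    # Leave the door open for refinement, as the advice suggests.
    lines.append("If the first draft misses the mark, I will ask you to revise.")
    return "\n".join(lines)

prompt = build_prompt(
    task="Summarize this syllabus in three bullet points for first-year students.",
    examples=["Plain language, no jargon", "Each bullet under 20 words"],
    expectations="A friendly, concise tone suitable for an orientation email.",
)
print(prompt)
```

The resulting text would be pasted (or sent programmatically) to a chatbot as the opening message of a conversation, with follow-up messages used to refine the result.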
Adoption is early, but accelerating. New AI tools are appearing faster than most people can track, and many people aren't actively doing anything with them yet because they don't know where to start. According to a 2023 Pew Research poll, the vast majority of Americans, or 90%, say they have heard at least "a little" about artificial intelligence, but only 1 in 3 say they have heard "a lot" about it. Microsoft founder and philanthropist Bill Gates predicted in his Gates Notes blog that by 2025, the use of AI would be common among most Americans.
Read about what AI means to AU in the Spring/Summer 2024 issue of AU Magazine.