ChatGPT and the future of Artificial Intelligence
ChatGPT's beta launch surpassed one million registered users within just one week, and it has caught the attention of virtually the entire technology ecosystem.
I read about it in the New York Times, the Financial Times, and The Atlantic, three of the most reputable outlets in my book.
AI is generating buzz in the workplace on the assumption that, because it is so efficient, it could threaten human jobs such as copywriting, handling customer queries, writing news reports, and even drafting legal documents.
In reality, there is more nuance to how we should think about applying large language models (LLMs) and generative AI like ChatGPT to the workplace, especially where the reliability of information is paramount.
I met with the executive team of Hebbia AI, a startup leading research efforts in LLMs, to explore this question.
When considering ChatGPT, or any other generative model, for work processes, it's crucial to understand a basic limitation:
Generative AI models produce responses, but they don't consult sources or cite their work. As a result, their output carries no guarantee of accuracy.
If you ask a crucial business question during an M&A deal, such as "What are the greatest risks in this acquisition?", you'll get a reasonable-sounding answer, but not one grounded in an actual source.
Analysts making recommendations to managing directors, or managing directors advising a client, need more accurate and reliable methods.
You should be able to probe the underlying sources and be confident in the answers you give to the company's most crucial questions.
When using generative models at work, it is crucial to supply them with relevant data and source material so that answers are grounded in research, not inference.
Semantic search engines use Large Language Models (LLMs) to search and read sources
While one use of LLMs is building generative AI like ChatGPT, LLMs can also power semantic search engines such as Hebbia.
In semantic search, models act as "neural" search systems that retrieve results based on the meaning of the query, rather than simply matching keywords as traditional search engines do.
For instance, an analyst searching a 10-K report for "trends" expects to find information about the business's competitors, customers, and growth.
A semantic search engine such as Hebbia can support this behavior; a keyword search may instead return results about company valuations.
Discriminative AI systems, such as semantic search engines, can capture the meaning behind questions and answers, and can cite evidence and sources.
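To make the retrieval idea concrete, here is a minimal sketch of semantic search in Python. The `embed` function is a toy bag-of-words stand-in I've made up for illustration; a real engine like Hebbia would use a neural LLM encoder here, so that related words like "risks" and "threats" land near each other in vector space. The ranking-by-similarity step, however, is the same.

```python
import math
from collections import Counter

def embed(text):
    """Toy stand-in for a neural embedding model: a bag-of-words vector.
    A real semantic search engine would use an LLM encoder here."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def search(query, passages, k=2):
    """Rank passages by similarity to the query and return the top k,
    each carrying its source so the answer can be cited."""
    q = embed(query)
    ranked = sorted(passages, key=lambda p: cosine(q, embed(p["text"])),
                    reverse=True)
    return ranked[:k]

# Hypothetical 10-K passages, labeled with their source locations.
passages = [
    {"source": "10-K, p. 12", "text": "growth trends among customers and competitors"},
    {"source": "10-K, p. 48", "text": "valuation of the company and its subsidiaries"},
]
top = search("trends in the business", passages, k=1)
print(top[0]["source"])  # the passage about growth trends ranks first
```

The key design point is that every result retains a `source` field: unlike a generative model's free-floating answer, each retrieved passage can be traced back and verified.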
If you want to apply generative AI to search in work settings, it needs to be paired with semantic search before it can be used with confidence.
Generative models can, and should, use semantic search as a "memory" to overcome the weaknesses they have on their own.
When semantic search and generative AI work together, they can improve the reliability and accuracy of research, and make it easier to keep that research current.
When generative AI and semantic search are used together, you can ensure that a generative AI chatbot answering a client's query provides accurate information about the company, and that responses to a crucial customer question reflect the most recent information about competitors.
Simply put, semantic search enhances question-and-answer applications of AI because:
- Models can be pre-loaded with relevant primary sources
- All sources and responses can be properly referenced
- Knowledge can be kept up to date via an index (without retraining the model)
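The three points above can be sketched as a simple pipeline: retrieve passages from an index, then pack them into the prompt so the model answers from them and cites them. This is a hedged illustration, not Hebbia's actual implementation: the `retrieve` function below uses naive word matching as a placeholder for real semantic search, and the assembled prompt would be sent to a generative model's API, which is omitted here. The `index`, `source` names, and sample texts are all invented for the example.

```python
def retrieve(query, index, k=2):
    """Placeholder retriever: naive word overlap over an index.
    A real system would use embedding-based semantic search."""
    words = query.lower().split()
    hits = [d for d in index if any(w in d["text"].lower() for w in words)]
    return hits[:k]

def build_prompt(query, sources):
    """Pack retrieved sources into the prompt, numbered so the model
    can cite them, instead of inferring from its training data."""
    cited = "\n".join(f"[{i+1}] ({s['source']}) {s['text']}"
                      for i, s in enumerate(sources))
    return (f"Answer using ONLY the sources below, citing them as [n].\n"
            f"{cited}\n\nQuestion: {query}")

# The index can be refreshed at any time, without retraining the model;
# that is how answers stay current.
index = [
    {"source": "deal memo", "text": "Key risks: customer concentration and pending litigation."},
    {"source": "press release", "text": "The company announced record revenue."},
]
sources = retrieve("greatest risks in the acquisition", index)
prompt = build_prompt("What are the greatest risks?", sources)
print(prompt)
```

Because the prompt carries numbered, labeled sources, every claim in the model's answer can be traced to a document in the index, which addresses the citation gap described earlier.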
The future of work will have generative AI at its heart.
Businesses that apply this technology appropriately can significantly enhance their workflows, making decisions more efficiently and reducing the cost of research.
Those that are too slow to adapt, or that deploy generative AI on its own without the technology needed to make it suitable for work (e.g., semantic search), will struggle to keep up.
If you're interested in generative AI or semantic search at work, contact Hebbia to learn more about what they do. I think they add the most important element to this crucial conversation the world is entering.
You can follow me on Twitter and LinkedIn, and see some of my other work here.
Technology helps us be more innovative, not less
Whichever creative technology you examine, the tension between it and the human creative process has been the same:
- There is a fear that people without experience or skill are suddenly granted an unearned ability to create.
This leads to the second tension:
- There is a fear that human craftspeople will be replaced by "hacks" and machines.
Both tensions have one thing in common: fear.
In every instance, however, such fears have proven unfounded. Sure, many technological advancements have eliminated the need for human participation in some part of the creative process.
Digital photography eliminated the need to process film in a darkroom. Digital editing software eliminated the need to physically cut and splice film together.
Digital imaging and word processing software eliminated the need to typeset manually.
But in every successful innovation, the technology enhanced a capability, made the creative process more efficient, or both.
So while the fear of being replaced is real, it only comes true when creators choose the wrong way to be changed by technology.
Simply put, today's content creator is no more or less skilled at conceiving or expressing ideas. ChatGPT and other forms of generative AI simply make those activities more effective.
So what is AI's role on our content creation and marketing teams? In a previous piece, I described the early output of ChatGPT (and the images created by DALL-E 2 and others) as "impressive, kind of."
Here’s what I mean:
If you look beyond cute prompts such as "write country song lyrics in the style of metal" (yes, that's what I wrote) and examine any of the essays or longer articles ChatGPT regularly produces, you may notice the absence of deeper human connection.
ChatGPT is clearly excellent (quite great, actually) at composing sentences that flow logically from one to the next. But there's no particularly emotional point of view, nothing akin to telling a whole story.
Put simply, ChatGPT can construct a plot and tell you what happened. But it's not good at telling the story in a way that makes you feel something.
AI is not wise.
Wisdom is the human quality of applying knowledge, experience, discernment, and intelligence to make good choices. AI currently cannot combine these attributes.
So it won't be able to judge the value of your next distinctive white paper or e-book.
It's not going to come up with the most original idea for your next podcast. It's not going to write the most visionary new business book. But it will create something that matches the style of each.
Think of it this way: if you're writing the next great American love story, you could use ChatGPT to generate a "meh" description of Charleston, South Carolina, from your character's perspective.
The text it generates will not help readers feel a strong emotional connection with South Carolina.
AI will be what we let it be.
In describing the inevitability of disruptive innovation, business professor and author Clayton Christensen once told the story of a professor who dropped a pen and said to his students, "I hate gravity."
After a few seconds, he added, "But you know what? Gravity doesn't care."
The truth about artificial intelligence is that it's already here. Debating whether or not it will be used is like asking photographers to delete their memory cards.
We've used AI to research topics via Google, to check the grammar of our writing, and to find the perfect image for our blog. Now it's helping us write the words.
The only question is how best to use it in your work.
When it comes to using artificial intelligence in content creation, many of the companies offering this new technology do themselves no favors by framing it as removing "drudgery" (or "grunt work") from the content creation process, or as being "magical."
It's important to note that creators do not view the processes and capabilities being transformed as drudgery, waste, or magic.
Digital film editing didn't just remove the chore of physically cutting and splicing film; it also let content creators do things they could not do before.
Digital imaging software didn't just remove the hassle of mixing paints; it enhanced the process by giving artists a broader array of color palettes to choose from.
AI opens new possibilities and expands the abilities of writers and other content creators, even as it closes doors for some kinds of content creation.
It will change the way businesses create written content. It will affect every one of us.
What we do about it, however, is in our hands.
Tell your story. Tell it well.