Friday, February 7, 2025

ChatGPT-4 Powered Bing vs. Google Bard: First Look on the Road to Artificial General Intelligence

ChatGPT reached a hundred million active users in two months and became the fastest-growing consumer app in history. This caused Google to feel an existential threat for the first time in years: ChatGPT can give you direct answers to search queries without needing to browse a list of potential results, potentially killing Google's ad business, which currently accounts for about 60% of its revenue. In response, Google declared a "code red" and invited the co-founders back to the company.

Sergey Brin even started coding again. Microsoft saw this as a once-in-a-lifetime opportunity to take back share in the search market, so it invested in OpenAI, the maker of ChatGPT, and integrated the technology into Bing, its long-dormant search engine. This is forcing Google to roll out its own chatbot, based on its LaMDA large language model, in order to compete with the Microsoft-OpenAI partnership.

Welcome to the war for the future of search. Microsoft has announced the integration of OpenAI's GPT-4 into Bing, branding it as "the new Bing." The AI-enhanced Bing has a new Chat option in the menu bar alongside Search, which allows users to toggle between the two modes. Upon selecting the chat option, users are greeted with a chat interface that provides three suggestions and a disclaimer, closely resembling the interface of ChatGPT. In search mode, results from Bing Chat appear next to the traditional search results. I saw some mixed opinions about it: some people love it, some people hate it. I find it a little bit distracting, but functionally it is great to have all this information from Bing Chat next to the traditional search screen.

The new Bing has access to current information from the web, and it clearly states its sources in the answers it gives, enabling users to verify the information. Bing is capable of handling complex tasks that usually require manual effort, such as generating meal or travel plans and writing code.

The AI's ability to comprehend and respond to natural-language queries also means that Bing can handle creative tasks, such as writing a rhyming poem or creating a short story, and it can engage in a conversational exchange with users, providing a more human-like research experience.

Google, on the other hand, responded with Bard, a new experimental conversational service powered by LaMDA, a large language model. Sundar Pichai mentioned that Bard is currently in an external testing phase with trusted testers. The word "experimental" was chosen specifically to prepare the public for the wrong or biased responses that may come from Bard. Bard runs on a lightweight version of LaMDA that requires significantly less computing power. It's understandable why they chose a lighter model to start with: the cost of operating ChatGPT has been estimated at roughly 1 to 3 million dollars per month, with each query costing approximately 3 cents. At Google's scale of about 10 billion queries per day, that would add up to an annual cost of around 110 billion dollars, which would not be financially feasible to sustain.

When we look at Bard's user interface, it looks similar to ChatGPT's. The results page does not show sources or citations, at least for now. To provide feedback, there are thumbs-up and thumbs-down buttons. The refresh button lets you rerun your query, while the "Check it" button takes you to a direct or relevant source on the web related to your question.
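As a quick sanity check on those operating-cost figures, here is the back-of-the-envelope arithmetic in a few lines of Python; the per-query cost and the query volume are the rough estimates quoted above, not official numbers.

cost_per_query_usd = 0.03           # ~3 cents per query (rough estimate quoted above)
queries_per_day = 10_000_000_000    # ~10 billion searches per day (rough estimate quoted above)

daily_cost = cost_per_query_usd * queries_per_day   # about 300 million dollars per day
annual_cost = daily_cost * 365                      # about 109.5 billion dollars per year

print(f"Daily cost:  ${daily_cost / 1e6:,.0f} million")
print(f"Annual cost: ${annual_cost / 1e9:,.1f} billion")

That lands at roughly 110 billion dollars a year, which is where the figure above comes from.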

Here is a quick comparison test between ChatGPT's and Bard's responses to the same exact prompt. Which one do you think did a better job? Both companies take a hybrid approach, combining a traditional search experience with chat. It makes sense, since search is still good at head queries: people, weather, locations, or movies. The all-time highest traffic for Google Search was during the FIFA World Cup final, as people searched for scores.

Large language models such as ChatGPT excel at handling tail queries while also offering incredible creative opportunities. It makes sense to have a hybrid approach combining traditional search with chat-based quick answers. Some people prefer a direct answer, a quick response to what they're searching for, and some people like the research experience, the journey rather than the destination: while researching something they want to discover new information and new people, and that's totally fine.
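To make the hybrid idea a bit more concrete, here is a purely illustrative Python sketch of routing a query either to a traditional index or to a chat model; the heuristics and names are invented for illustration and are not how Bing Chat or Bard actually works.

HEAD_TOPICS = ("weather", "score", "showtimes", "stock price", "near me")

def route_query(query: str) -> str:
    # Very rough heuristic: short or clearly factual "head" queries go to the
    # traditional search index; everything else goes to the chat model.
    q = query.lower()
    is_short = len(q.split()) <= 4
    looks_like_head = is_short or any(topic in q for topic in HEAD_TOPICS)
    return "traditional_search" if looks_like_head else "llm_chat"

print(route_query("weather in Istanbul"))                # traditional_search
print(route_query("plan a 5-day budget trip to Japan"))  # llm_chat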

It's too soon to determine whether traditional search and chat-based search will eventually merge; only time will tell. Everybody is asking the same question: where are we going with all these developments? The AI arms race between tech giants will accelerate further, and we will see better and more capable large multimodal models, combining not just written text but also images, audio, and video, like Google's PaLM model.

In the coming months, the scalability of large language models will be an important issue, given the massive computational resources they require, which can only be supplied by tech giants like Google or Microsoft. Another challenge lies in the need to continually and intelligently update these models with the latest information available online, which is constantly changing; finding cost-effective ways to do this will be an important problem to solve. The effect on websites that depend on search traffic will be catastrophic.

If you ask for a recipe, you won't need to visit the recipe website. It's clear this will happen, based on past experience such as when Google introduced featured snippets to search: some websites reported a 20% decrease in traffic as a result. The former director of AI at Tesla mentioned GitHub Copilot in a tweet, saying that already 80% of the code he writes on a daily basis is written by AI. Guess who owns GitHub?

Microsoft. It's hard to call a multibillion-dollar company an underdog, but Microsoft is giving me the same feeling I got watching Morocco reach the semi-finals of the World Cup last year, and I love underdogs. The generative capabilities of these models will grow rapidly, up to the moment when they begin to improve themselves. We know that before 2010, training compute grew in line with Moore's Law, doubling roughly every 20 months. Since the advent of deep learning in the early 2010s, the scaling of training compute has accelerated, doubling approximately every 6 months. Based on these facts, it's safe to say that eventually search engines will transition into answer engines.
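For a rough sense of what those doubling times imply, here is a minimal Python sketch comparing the two growth regimes; the 20-month and 6-month doubling periods are simply the estimates quoted above.

def growth_factor(months: float, doubling_period_months: float) -> float:
    # How many times training compute multiplies over the given span.
    return 2 ** (months / doubling_period_months)

years = 5
months = years * 12
print(f"Over {years} years with 20-month doubling: ~{growth_factor(months, 20):,.0f}x")
print(f"Over {years} years with  6-month doubling: ~{growth_factor(months, 6):,.0f}x")

Over the same five years, that is roughly an 8x increase versus a 1,000x increase, which is why the current pace feels so different.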

Answer engines will transition into hyper-advanced personal assistants, accessible from our mixed-reality headsets and augmented-reality glasses. People will start hiring these models as employees: they never sleep, never get sick or tired, and sooner or later they will be able to do knowledge-processing work better than us. In the future, specialized multimodal models that focus on individual domains, or what I like to refer to as specialized artificial intelligence, will begin to interact with each other as well as with data from robotics solutions operating in both real-world and simulated environments.

This will be the point when discussions about artificial general intelligence become much more serious and prominent. Once we reach AGI, the artificial superintelligence level won't take too long to reach. Over the next decade, I believe the cost of intelligence will rapidly decrease to almost zero, leading to the democratization of intelligent and creative power.

As a knowledge worker, I reflect on the future of my career and acknowledge that it may not be relevant in 20 years. Despite this, I remain optimistic and eager for the new opportunities that will come with the rapid advancement of artificial intelligence. I'm fully committed to embracing this change and exploring all that it will offer.
