GPT-3 vs Bloom [Real Code & Real Results]

When it comes to the language side of AI models, GPT-3 is the clear winner.

Even though it’s a paid model, the relatively small cost is well worth the huge improvement you get when using GPT-3 compared to Bloom.

In this blog post, we’ll compare GPT-3 against Bloom and show you the results from our prompt-designed tests – and trust us, it wasn’t close!

GPT-3 performed far better than Bloom in every single test, making it the obvious choice if you’re looking for an AI model to help with your project.

If the paid aspect of GPT-3 concerns you, don’t worry: we’ll take a closer look at each of these models later in this post so you can decide which one is best suited to your particular needs.

So sit back and enjoy as we finally figure out who’s the LLM leader: GPT-3 or Bloom!

GPT-3

GPT-3 is a language model developed by OpenAI.


It is an API-based system that uses natural language processing to generate text, similar to how humans do.

It works simply: you send it instructions (your prompt), and it sends back your output.

GPT-3 can be used in many applications, such as auto-completion, summarization, sentiment analysis, extraction, and translation.
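
Each of those applications really comes down to a different prompt pattern fed to the same completion endpoint. As a rough sketch (these templates are our own illustrations, not official OpenAI ones):

# Illustrative prompt templates for a few of the tasks above.
# These are our own examples, not official OpenAI templates.
prompt_templates = {
    "summarization": "Summarize the following text in two sentences:\n{text}",
    "sentiment": "Classify the sentiment of this review as positive or negative:\n{text}",
    "extraction": "List every person mentioned in this text:\n{text}",
    "translation": "Translate the following English text into French:\n{text}",
}

prompt = prompt_templates["sentiment"].format(text="I loved this product!")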

Although Bloom and many other open-source LLMs have no cost associated with them, GPT-3 is not free.

That said, when we say low cost, we really mean it.

The most powerful GPT-3 model (Davinci) will run you $0.02 per 1,000 tokens.

While tokens aren’t exactly words, the math usually works out to 1,000 tokens being roughly equivalent to 750 words.

So essentially, you’ll be paying $0.02 for around 750 words.
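
To put that in perspective, here’s a quick back-of-the-envelope cost estimator using those numbers (keep in mind the 750-words-per-1,000-tokens figure is a rule of thumb, not an exact conversion):

# Rough Davinci cost estimate at $0.02 per 1,000 tokens,
# using the ~750 words per 1,000 tokens rule of thumb.
PRICE_PER_1K_TOKENS = 0.02
WORDS_PER_1K_TOKENS = 750

def estimate_cost(word_count):
    """Approximate cost in dollars to generate `word_count` words with Davinci."""
    tokens = word_count / WORDS_PER_1K_TOKENS * 1000
    return tokens / 1000 * PRICE_PER_1K_TOKENS

print(f"${estimate_cost(3000):.2f}")  # a 3,000-word article: about $0.08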

Think of GPT-3 as a suite of text-related products where developers and businesses have full access to powerful natural language capabilities without needing to create their own LLM (which is incredibly expensive).

The “GPT system” can also be leveraged to create bots and virtual assistants that would otherwise require a development team and massive datasets.

GPT-3 has made a name for itself by providing an efficient way to generate text with minimal input, making it useful for businesses that want to create automated content or applications quickly for a low cost.

While generating text is something that GPT-3 is excellent at, it’s also highly sophisticated at some fringe tasks.

These tasks include code completion (like writing complete Python functions from simple prompts), editing text to make it grammatically correct, and even rewriting text in the voice of a celebrity.
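
Grammar fixing, for example, has its own dedicated endpoint. Here’s a minimal sketch using the legacy openai 0.x SDK (the same one used in the code later in this post); the instruction wording is our own:

# Minimal grammar-correction sketch with GPT-3's Edits endpoint
# (legacy openai 0.x SDK; the instruction text is our own example).
import openai

openai.api_key = 'key'  # replace with your own API key

response = openai.Edit.create(
    model="text-davinci-edit-001",
    input="she no went to the market yesterday",
    instruction="Fix the grammar.",
)

print(response['choices'][0]['text'])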


Bloom

Bloom is an open-source Large Language Model (LLM) from BigScience. The model was built on a modified version of the architecture behind the original Megatron-LM GPT-2.


This beast was released in mid-2022, so you know it’s state-of-the-art.

Bloom is trained on 1.6TB of pre-processed text, which is about 350B unique tokens.

What’s even more interesting is that 10% (0.16 TB) of that data was pure code, making the model heavily familiar with code and able to produce real, working code.

To train this model, the Jean Zay supercomputer was actually borrowed from the French government.

Like GPT-3, it processes vast amounts of text data with industrial-scale computational resources, producing coherent text in 46 languages and 13 programming languages.

Bloom’s goal is to generate text that is hard to distinguish from human writing.

Not only is Bloom great at generating text in the domains it was trained on, but it also works extremely well in scenarios it wasn’t explicitly trained for.

With the help of crafty prompts, you’ll be able to use Bloom just like GPT-3 to generate human-like text.
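
One thing to keep in mind (and this is our reading, based on Bloom being a base model rather than an instruction-tuned one): “crafty” usually means framing your task as a text continuation instead of a command. Something like:

# Few-shot framing for a base LM like Bloom: show the pattern,
# then let the model continue it (our own illustrative prompt).
prompt = (
    "Review: The food was cold and the service was slow.\n"
    "Sentiment: negative\n"
    "Review: Absolutely loved the atmosphere and the staff.\n"
    "Sentiment:"
)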

While Bloom is free to download from Hugging Face, it personally took me more than six hours to download this model.

The 8GB+ download size plus the six-plus-hour wait was a cost of its own, and it added another point in the GPT-3 column.

Even with all of this, Bloom is still an incredibly powerful tool in natural language processing, offering great potential for various business and coding applications.

Here’s the link to Bloom’s paper.


What Did Bloom Do Better? The Good, The Bad, The Ugly

Let’s really break it down to understand the differences between these two.

And well..


The Good

  • Bloom is free compared to GPT-3, meaning you can generate a lot of text at no cost once you’ve downloaded the model.
  • Bloom generated text and worked as promised.

(I wish there were more, but that’s really all I could come up with)

The Bad

  • Takes about 8 GB of disk space to download
  • The text isn’t a home-run hit.

On the software side of things, getting up and running with GPT-3 was much easier than with Bloom.

While GPT-3 takes up minimal space (besides the PyPI package), Bloom requires about 8 GB of storage.
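
If you want to see exactly how much disk space your downloaded models are eating, the huggingface_hub library ships a cache scanner (assuming a reasonably recent version; scan_cache_dir was added around v0.9):

# Inspect how much disk space locally cached Hugging Face models occupy.
from huggingface_hub import scan_cache_dir

cache_info = scan_cache_dir()
print(f"Total cache size: {cache_info.size_on_disk / 1e9:.1f} GB")
for repo in cache_info.repos:
    print(f"{repo.repo_id}: {repo.size_on_disk / 1e9:.2f} GB")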

The Ugly

  • It took me FOREVER to download this model, and I’m using an M1 Mac
  • Text Limits

I did download Bloom to use it.

However, it took forever.

After about hour 4, I started to lose hope.

I had so much faith in it initially, but a wait time this long discourages me from recommending it to anyone.

Even after that, when I tried to generate some text, I was capped at about 400 tokens, which frustrated me even more, as that’s too restrictive for real-time usage.

The main advantage Bloom has over GPT-3 is that it’s free, meaning that if you want to generate long-form writing, you should be able to do it at no cost.

Since I wasn’t able to get anything past 400 tokens to work, it made very little sense to ever use this over GPT-3.


GPT-3 vs. Bloom Code Used

We thought we’d provide the code we used so that these results could be replicated.

Make sure you change the prompts.

# --- GPT-3 (legacy OpenAI Completion API) ---
import openai

openai.api_key = 'key'  # replace with your own OpenAI API key

p = '''Write me a 5-step plan on becoming a software engineer'''

# generate the response
response = openai.Completion.create(
    engine="davinci-instruct-beta-v3",
    prompt=p,
    temperature=0.7,    # some randomness, but still fairly focused
    max_tokens=500,     # plenty of room for a 5-step plan
    top_p=1,
    frequency_penalty=0,
    presence_penalty=0,
    stop=["12."]        # stop sequence so numbered lists don't run on forever
)

# grab our text from the response
text = response['choices'][0]['text']

print(text + '\n')
# --- Bloom (Hugging Face transformers) ---
# Note: this pulls the 560M-parameter checkpoint, not the full 176B model.
from transformers import BloomForCausalLM, BloomTokenizerFast

tokenizer = BloomTokenizerFast.from_pretrained("bigscience/bloom-560m")
model = BloomForCausalLM.from_pretrained("bigscience/bloom-560m")

prompt = 'Write me a 5-step plan on becoming a software engineer'

# tokenize the prompt into PyTorch tensors
inputs = tokenizer(prompt, return_tensors="pt")

# generate up to 400 tokens; do_sample=True so temperature and top_k actually apply
sample = model.generate(
    **inputs,
    max_length=400,
    do_sample=True,
    top_k=1,
    temperature=0.7,
    repetition_penalty=2.0,
)

# decode back to text (truncate_before_pattern only applies to CodeGen-style
# tokenizers, so we use skip_special_tokens instead)
print(tokenizer.decode(sample[0], skip_special_tokens=True))


GPT-3 vs. Bloom, Prompt Results

Here are the prompt results between GPT-3 and Bloom.

I fed both Bloom and GPT-3 the same prompt and recorded their output below.

Let’s see how they did.


Prompt #1:

Write a quick paragraph about a data scientist who fell in love with his job in new york city

GPT-3
The data scientist fell in love with his job in New York City. He found the perfect combination of a challenging, stimulating work environment and a vibrant, exciting city. He loved the intellectual stimulation of working with large data sets and the variety of projects he was able to work on. And he loved the energy and excitement of living in a city that never sleeps.

Bloom

he was working on an application for the federal government to develop and implement its own online health insurance system, which would allow people from all over america access their medical records electronically.
The first thing that happened when i started writing this post is because of my love affair (and hate) at work! I had been thinking how much better it could be if we were able just get together every day instead having our meetings held outside or even sitting down each other’s side tables!
I have always loved meeting up once per week but as soon after moving into apartment 2 there hasn’t really changed anything since then so I’ve decided it’s time again


Prompt #2:

Write python code on how to add 15 to the number passed into the function, square it, then divide it by 5

GPT-3
def add_15(number):

return number + 15


def square(number):

return number * number

def divide(number, divisor):

return number / divisor
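
Note that GPT-3 split the task into three separate functions rather than chaining them, so stringing them together the way the prompt describes is left to you:

# Chaining GPT-3's generated functions in the order the prompt asked:
# add 15, square the result, then divide by 5.
result = divide(square(add_15(10)), 5)
print(result)  # (10 + 15)**2 / 5 = 125.0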

Bloom

The result is a string of length 3.
I am not sure if this will work or what I need done wrong.

A:

You can use Python's built-in math module:
import numpy as np</s>


Prompt #3:

Write me a 5-step plan on becoming a software engineer

 

GPT-3

1. Choose the right programming language

There are many programming languages to choose from, but some are more popular than others when it comes to software engineering. Java, Python, and C++ are all popular languages that are used in many software engineering roles.

2. Get a solid foundation in programming

Before you can become a software engineer, you need to have a strong foundation in programming. This can be done by taking online courses, attending bootcamps, or studying on your own.

3. Get experience

The best way to become a software engineer is to get experience. This can be done by working as a junior developer, intern, or contractor.

4. Get a job

Once you have the experience and skills required for a software engineering role, it’s time to start looking for a job. Network with people in the industry, attend job fairs, and use job boards to find your dream job.

5. Finally, keep learning

The software engineering field is always changing, so it’s important to keep learning. Read articles, attend conferences, and continue to improve your skills.

Bloom

I have been working in the IT industry for over 20 years and am now an experienced Software Engineer with more than 10+ year experience.
I will be happy to help you get started by providing some background information about your current career, what skills are required of someone who is looking into starting their own business or if they already do it but want something different from traditional workforce management systems that can provide them extra benefits such as increased productivity etc
If you’re interested then please contact us at PI:KEY we’d love hearing back! We’re always open so just let our team know how much time it’s going!
The first thing I’d like everyone else doing when thinking up new ideas would probably start off asking yourself “what’s my next step?


Outro & Summary

As you can see from our tests, GPT-3 is just much better in the key areas – its accuracy, speed, ease of use, and scalability are unmatched by Bloom.

We hope this information helps you make better decisions regarding your applications and understand what works best for your project.

Overall, GPT-3 is an amazing tool that has really changed how AI can be used and implemented in projects around the world. With more advanced versions being released all the time, there’s no doubt that it will continue to be one of the most useful AI writing tools of our generation.

 

Other Articles In Our GPT-3 Series:

GPT-3 is pretty confusing. To combat this, we have a full-fledged series that will help you understand it at a deeper level.

Those articles can be found here:

Stewart Kaplan

2 Comments

  1. If I understand correctly, you haven’t downloaded the “real” Bloom model (which is more than 300 GB) but a smaller one. I’m not sure it would really fit on a regular computer anyway.
    Therefore it seems normal that the generated outputs are… well… more than disappointing, since it’s like comparing GPT-2 to GPT-3.
    Bloom (the large one) has an API accessible on Hugging Face. Even for this large model, I still find the results hard to compare to the ones provided by GPT-3. They are less usable.
    Yet I think that the price of GPT-3 token generation (and the time it takes) would remain a problem if you plan to use it intensively in any deployed app.

  2. You’re on the right track, though I don’t think Bloom would want their product compared to a lesser GPT.

    I know most of the hype surrounding Bloom was about being self-hosted, so we went with a model version that could be… self-hosted.

    The download on even just this one was atrocious, and I couldn’t imagine downloading a more expansive model.
