From Binary to Code: A Journey Through Computer Languages

  • Due: No due date
  • Points: 5
  • Questions: 5
  • Time Limit: None
  • Allowed Attempts: 3

Instructions

Welcome to this case study on computer languages and programming! In this engaging narrative, you'll follow two high school students as they explore how computers represent and process data, from binary code to programming languages. As you read, pay attention to how computers store text using ASCII and Unicode, how programming languages work, and the steps involved in turning source code into a running program. You'll also learn about different programming languages and their uses, with a focus on Python. This case study will help you understand the foundations of computer science and programming.

Introduction

Calissa Song tapped her pencil against her notebook, staring at the computer screen in the high school's computer lab. Her dark hair was pulled back in a practical ponytail, and her brow furrowed in concentration. As a sophomore with a passion for technology, she had joined the school's new coding club, but today's challenge had her stumped.

The Binary Bridge: From Bits to Programs

"I don't get it," she said to her friend Eman Guo, who sat at the computer next to her. "How does the computer actually understand what we're typing? When I press the letter 'A' on my keyboard, how does it know what that means?"

[Image: Binary code background (seewhatmitchsee / Shutterstock)]

Eman adjusted his glasses and swiveled his chair to face her. He was a junior who had been coding since middle school and had a knack for explaining complex concepts.

"That's actually a really good question," he said. "It all comes down to binary."

"Binary? You mean all those ones and zeros?" Calissa asked, remembering a lesson from their Computer Science class.

"Exactly," Eman nodded. "Everything in a computer is stored as binary data—just sequences of 0s and 1s. When you press the letter 'A' on your keyboard, the computer doesn't actually store the letter 'A'. It stores a binary code that represents 'A'."

Understanding Binary

Introduction to Binary

Ms. Asadi, their Computer Science teacher, overheard their conversation as she walked by. She was a former software engineer who loved teaching students about the foundations of computing.

"Mind if I join in?" she asked, pulling up a chair. "Eman's right. Computers can only understand binary data. Everything—text, images, videos, programs—has to be converted to binary for a computer to process it."

Calissa looked puzzled. "But how does that work? How can zeros and ones represent letters?"

Bits and Bytes

Ms. Asadi smiled. "Let me show you. The computer's memory is divided into tiny storage locations called bytes. Each byte is made up of eight smaller units called bits."

"Bit stands for binary digit, right?" Eman chimed in.

"Correct," Ms. Asadi said. "A bit can be either 0 or 1—like a tiny switch that's either off or on. Eight bits together make one byte, which is enough to store a single character or a small number."

She grabbed a piece of paper and drew a row of eight boxes.

"Let's say we want to store the letter 'A'. In computers, we use a standard code called ASCII—the American Standard Code for Information Interchange—to represent characters. In ASCII, the letter 'A' is represented by the decimal number 65."

"But computers don't understand decimal numbers directly," Eman added. "They convert that 65 to binary."

"Right again," Ms. Asadi said. "The decimal number 65 in binary is 01000001."

She wrote the binary number in the boxes:

01000001

"So when I type the letter 'A'," Calissa said slowly, working it out, "the computer stores it as 01000001?"

"Exactly!" Ms. Asadi said. "Each character has its own unique binary code. For example, the letter 'B' is 66 in decimal, which is 01000010 in binary."

Calissa's eyes widened with understanding. "So that's how computers can store text—by converting each character to a number, and then converting that number to binary!"

ASCII and Unicode

"You've got it," Ms. Asadi nodded. "But ASCII has limitations. It only defines codes for 128 characters, which is enough for English letters, numbers, and some special characters, but not for other languages or symbols."

"What about languages that don't use the English alphabet?" Calissa asked. Her grandparents were from China, and she knew Chinese had thousands of characters.

"That's where Unicode comes in," Ms. Asadi explained. "Unicode is a more extensive encoding scheme that's compatible with ASCII but can represent characters from virtually all of the world's writing systems. The first 128 Unicode characters are identical to ASCII, but Unicode extends far beyond that, with nearly 150,000 unique symbols and codes."

Eman typed something on his computer and turned the screen toward Calissa. "Look, I just found a Unicode table online. See how many more characters it includes?"

Calissa scrolled through the table, amazed at the variety of symbols and characters from different languages. "This is incredible! So Unicode is like ASCII but much bigger?"

"That's a good way to think about it," Ms. Asadi said. "Unicode has become the standard for character encoding in the computer industry because it's so comprehensive."

Ruminating on Binary

As the bell rang signaling the end of their free period, Calissa gathered her things, her mind buzzing with new understanding.

"Thanks for explaining this," she said to Ms. Asadi and Eman. "I never realized there was so much happening behind the scenes just to display text on a screen."

"That's just the beginning," Ms. Asadi smiled. "Tomorrow, we can talk about how computers represent other types of data, like images and sound."

As they walked out of the computer lab, Eman turned to Calissa. "Want to meet up this weekend to work on our coding project? I think understanding how computers store data will help us write better code."

"Definitely," Calissa nodded. "I'm starting to see how everything connects. It's like learning the alphabet before trying to write a novel."

"Exactly," Eman grinned. "And we're just getting started.

Learning Conversions

The next day, Calissa arrived at the computer lab early, eager to learn more. She found Eman already there, typing away at one of the computers.

"Hey," she said, setting her backpack down. "What are you working on?"

"I'm trying to convert some decimal numbers to binary," Eman replied. "I thought it would help us understand how computers store numerical data."

Calissa pulled up a chair next to him. "Can you show me how to do that?"

Converting Decimals to Binary

"Sure," Eman said. "Let's start with a simple example. Say we want to convert the decimal number 42 to binary."

He grabbed a piece of paper and drew a chart with two rows and eight columns. In the top row, he wrote the powers of two from left to right: 128, 64, 32, 16, 8, 4, 2, 1.

"First, we find the largest power of two that's not greater than our number," Eman explained. "For 42, that's 32. So we put a 1 under 32 in our chart."

He wrote a 1 under the 32 column.

"Then we subtract 32 from 42, which gives us 10. Now we find the largest power of two that's not greater than 10, which is 8. So we put a 1 under 8."

He wrote a 1 under the 8 column.

"Subtracting 8 from 10 gives us 2, so we put a 1 under the 2 column as well. After subtracting 2, we get 0, which means we're done. We fill in all the other positions with 0s."

The completed chart looked like this:

128   64   32   16    8    4    2    1
  0    0    1    0    1    0    1    0

"So 42 in binary is 00101010," Calissa read from the chart.

"Exactly!" Eman said. "And if we want to convert from binary back to decimal, we just add up the values of all the positions that have a 1. So for 00101010, that's 32 + 8 + 2 = 42."

Beyond Binary Basics
Negative Numbers

Ms. Asadi walked into the lab and smiled when she saw them working together.

"I see you two are getting a head start on today's lesson," she said. "Binary conversions are an important skill for understanding how computers work."

"It's actually pretty straightforward once you get the hang of it," Calissa said. "But I'm wondering—how do computers handle negative numbers or decimal points?"

"Great question," Ms. Asadi said. "For negative numbers, computers use a technique called two's complement. And for numbers with decimal points—what we call real numbers—they use floating-point notation. These are more advanced topics, but the important thing to understand is that computers have ways to encode all types of numbers into binary."

Images and Music

"What about other types of data?" Calissa asked. "Like images or music?"

"Those are also stored as binary data," Ms. Asadi explained. "Digital images are made up of tiny dots called pixels. Each pixel's color is represented by a numeric code, which is stored as binary. Similarly, digital music is broken into small pieces called samples, each converted to binary."

The Power of Binary

"That's why computers are called digital devices," Ms. Asadi nodded. "The term 'digital' refers to anything that uses binary numbers. Digital data is data stored in binary format, and a digital device is any device that works with binary data."

Calissa thought about all the digital devices she used every day—her smartphone, laptop, tablet, smart watch, even the digital display on her microwave.

"It's amazing to think that all those devices are just processing patterns of ones and zeros," she said.

"And yet, from those simple patterns, we get all the complexity of modern computing," Ms. Asadi added. "That's the beauty of computer science—building complex systems from simple foundations."

One More Example

As more students began to file into the lab for class, Eman quickly showed Calissa one more example.

"Let's convert the binary number 10110 back to decimal," he said. "We have 1s in the 2, 4, and 16 positions, so that's 2 + 4 + 16 = 22."

"I think I've got it," Calissa said confidently. "Binary is like the computer's native language, and everything else has to be translated into it."

"That's a perfect way to put it," Ms. Asadi smiled. "And speaking of languages, today we're going to talk about programming languages and how they relate to the binary code that computers actually execute."

Calissa opened her notebook, ready to learn more about the bridge between human instructions and the computer's binary world.

Programming Fundamentals

Programming Languages

The following week, Calissa and Eman met at the local library to work on their coding project. They had decided to create a simple program that would convert text to binary and back again, applying what they had learned about ASCII and Unicode.

"I've been thinking about something Ms. Asadi mentioned," Calissa said as they settled at a table with their laptops. "She said that computers can only understand machine language, but we're writing our program in Python. How does that work?"

Eman opened his laptop. "That's a great question. When we write code in Python or any other high-level programming language, it has to be translated into machine language before the computer can execute it. In Python's case, a program called an interpreter handles that translation."

Interpreters and Compilers

Calissa thought about this for a moment. "So the interpreter is like a translator that sits between our code and the computer's CPU?"

"That's a good way to think about it," Eman said. "The CPU—Central Processing Unit—is the part of the computer that actually runs programs. But it can only understand instructions written in machine language, which are basically binary patterns."

[Image: Computer CPU board, close-up (Coleman Yuen / Pearson Education Asia Ltd)]

"That sounds incredibly tedious to write," Calissa remarked. "Imagine having to write an entire program as a series of ones and zeros!"

"That's why we have programming languages like Python," Eman laughed. "In the early days of computing, programmers did have to write in machine language or something very close to it called assembly language. Assembly language uses short words called mnemonics instead of binary numbers, but it's still very low-level and requires intimate knowledge of how the CPU works."

Calissa opened a new Python file on her laptop. "So when I write something like print('Hello, World!') in Python, the interpreter translates that into machine language instructions that the CPU can understand?"

"Exactly," Eman said. "And the CPU follows what's called the fetch-decode-execute cycle for each instruction. First, it fetches the instruction from memory. Then it decodes the instruction to figure out what operation to perform. Finally, it executes the operation."

Handling Syntax Errors

"That makes sense," Calissa nodded. "But what happens if I make a mistake in my code?"

"If you make a syntax error—like forgetting a parenthesis or misspelling a keyword—the interpreter will catch it and display an error message," Eman explained. "Syntax refers to the rules of the programming language, similar to grammar rules in human languages."

"But unlike human languages, computers aren't forgiving of syntax errors," Calissa noted. "If I say something grammatically incorrect in English, people can usually still understand what I mean. But if I make a syntax error in Python..."

"The interpreter stops and tells you there's an error," Eman finished. "Computers need precise instructions."

Implementing the Text-to-Binary Converter

Calissa typed a simple Python program:

print("Converting text to binary")

text = input("enter some text")

binary result = ""

for char in text: 

    ascii_code=ord(char)

    binary=bin(ascii_code)[2:]

    binary_result+=binary+""

print("Binary representation:",binary_result)

"Let me explain what this code does," she said to Eman. "First, it asks the user to enter some text. Then, for each character in that text, it finds the ASCII code using the ord() function. Next, it converts that ASCII code to binary using the bin() function, which adds '0b' at the beginning to indicate it's a binary number, so we remove that with the [2:] slice. Finally, it adds each binary representation to a result string and prints it out."

"That looks good," Eman said, impressed by how quickly Calissa was picking up Python. "Let's run it and see what happens."

Calissa ran the program and typed "Hello" when prompted. The program displayed:

Binary representation: 1001000 1100101 1101100 1101100 1101111

"It works!" Calissa exclaimed. "Each binary number represents a character in 'Hello'."

Adding More Functionality

"Now let's add a function to convert binary back to text," Eman suggested.

They spent the next hour adding to their program, testing it with different inputs, and fixing bugs. As they worked, Calissa gained a deeper understanding of how programming languages serve as a bridge between human-readable instructions and the binary code that computers execute.

"You know," she said as they were packing up to leave, "I'm starting to see why Ms. Asadi said that understanding how computers store and process data is fundamental to becoming a good programmer. It helps you think about what's happening behind the scenes."

"Definitely," Eman agreed. "And once you understand these basics, you can apply them to any programming language you learn in the future."

As they walked out of the library, Calissa felt a sense of accomplishment. What had seemed like abstract concepts just a week ago were now becoming clear, practical tools that she could use to solve real problems.

Additional Programming Languages

Comparing Programming Languages

The next day in Computer Science class, Ms. Asadi announced that they would be exploring different programming languages and their uses.

"While we focus on Python in this course," she explained, "it's important to understand that there are hundreds of programming languages, each with its own strengths and purposes. Understanding the landscape will help you choose the right tool for different tasks in the future."

She displayed a slide showing code examples in different languages, all performing the same simple task: printing "Hello World!" to the screen.

"Let's compare these," Ms. Asadi said. "Notice how the syntax differs between languages."

Calissa studied the examples:

C/C++

// C program to display Hello World!
#include <stdio.h>

int main() {
    printf("Hello World!");
    return 0;
}

Java

// Java program to display Hello World!
class HelloWorld {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}
Python

# Python program to display Hello World!
print("Hello World!")

JavaScript

// JavaScript program to display Hello World!
<script>
    console.log("Hello World!")
</script>

"Wow, the Python version is so much simpler!" Calissa remarked.

"That's one reason why Python is often used as a first programming language," Ms. Asadi nodded. "Its syntax is clean and readable, which makes it easier to learn. But each language has its own advantages and typical use cases."

Language Characteristics and Use Cases

She went on to explain how some languages were designed for specific purposes, while others were general-purpose languages that could be used for a variety of applications.

"For example, FORTRAN was the first high-level programming language, designed in the 1950s for scientific and engineering calculations," Ms. Asadi said. "COBOL was created around the same time for business applications. Today, languages like C and C++ are often used for operating systems and embedded devices, while JavaScript is essential for web development."

Compiled vs Interpreted Languages

Eman raised his hand. "What about the difference between compiled and interpreted languages? You mentioned that Python uses an interpreter, but I know some languages use compilers instead."

"Great question," Ms. Asadi said. "A compiler translates an entire program from a high-level language to machine language all at once, creating a separate executable file. This is how languages like C and C++ work. An interpreter, on the other hand, translates and executes the code line by line, which is how Python works."

She drew a diagram on the board showing the difference:

Compiled Language:

Source Code → Compiler → Machine Code → CPU executes Machine Code

Interpreted Language:

Source Code → Interpreter translates and executes line by line → CPU

"Each approach has its advantages," Ms. Asadi continued. "Compiled programs generally run faster becau+se the translation is done ahead of time. Interpreted languages are often more flexible and portable across different operating systems."

Calissa thought about their text-to-binary converter program. "So when we run our Python program, the interpreter is translating each line to machine language and then immediately executing it?"

"Exactly," Ms. Asadi confirmed. "And if there's a syntax error in your code, the interpreter will stop at that point and display an error message."

Type Systems: Strongly-Typed vs. Loosely-Typed Languages

"What about strongly-typed versus loosely-typed languages?" asked another student. "I've heard those terms before."

"Another excellent question," Ms. Asadi said. "In a strongly-typed language like Java or C++, you have to explicitly declare what type of data a variable will hold—like an integer or a string—and that variable can only hold that type of data. In a loosely-typed language like JavaScript, variables can hold any type of data and can even change types during program execution."

She wrote examples on the board:

// Java (strongly-typed)
int number;          // Declare a variable of type integer
number = 100;        // Assign an integer value
// number = "hello"; // This would cause an error!

// JavaScript (loosely-typed)
let number = 100;    // number is an integer
number = "hello";    // Now number is a string, no problem!
Python's Characteristics

"Where does Python fit in?" Calissa asked.

"Python is interesting because it has characteristics of both," Ms. Asadi explained. "It's dynamically typed, meaning you don't have to declare variable types in advance, but it is also strongly typed in the sense that it won't automatically convert between incompatible types without explicit instructions."

As the class continued, they discussed the evolution of programming languages, from early languages like FORTRAN and COBOL to modern languages like Python, JavaScript, and Ruby. Ms. Asadi emphasized that while the syntax and features of languages might differ, the fundamental concepts of programming remained consistent across them.

"The key is to understand the core concepts," she said. "Once you grasp variables, control structures, functions, and data structures in one language, you can apply that knowledge to learn any other language more easily."

Applying Knowledge to Projects

After class, Calissa and Eman stayed behind to ask Ms. Asadi more questions about their project.

"I'm really excited about our text-to-binary converter," Calissa said. "But I'm wondering if we could expand it to handle Unicode characters as well, not just ASCII."

"That's a great idea," Ms. Asadi said. "Python has built-in support for Unicode, so it shouldn't be too difficult to modify your program."

"And maybe we could add a graphical user interface instead of just using the command line," Eman suggested.

"Absolutely," Ms. Asadi nodded. "Python has several libraries for creating GUIs, like Tkinter or PyQt. That would be an excellent next step."

As they left the classroom, Calissa felt a growing confidence in her programming abilities. What had started as curiosity about how computers understand text had blossomed into a deeper understanding of computer science fundamentals and a passion for coding.

"You know," she said to Eman as they walked down the hallway, "a few weeks ago, I couldn't have imagined understanding all this. But now I'm actually excited to learn more about programming languages and how they work."

"That's how it starts," Eman smiled. "First, you learn the basics. Then, before you know it, you're building real programs and solving real problems."

"And it all comes back to those ones and zeros," Calissa mused. "Binary—the universal language of computers."

The Final Project

Planning the Project

The weekend before their coding project was due, Calissa and Eman met at Calissa's house to finalize their text-to-binary converter application. They had decided to expand it into a more comprehensive tool that could convert between text, ASCII/Unicode values, and binary.

"I've been thinking about how to structure our program," Calissa said as they sat at her desk, laptops open. "We should use functions to make the code more organized and reusable."

"Good idea," Eman agreed. "Let's start by defining the main functions we need."

They sketched out a plan for their program:

  1. A function to convert text to ASCII/Unicode values
  2. A function to convert ASCII/Unicode values to binary
  3. A function to convert binary back to ASCII/Unicode values
  4. A function to convert ASCII/Unicode values back to text
  5. A simple menu system to let users choose which conversion they want to perform
Building the Code
Text to ASCII Function

"Let's start coding," Calissa said, opening her Python IDE (Integrated Development Environment). "I'll create the text to ASCII function first."

She began typing:

def text_to_ascii(text):
    """Convert a string of text to a list of ASCII/Unicode values."""
    ascii_values = []
    for char in text:
        ascii_values.append(ord(char))
    return ascii_values

"The ord() function returns the Unicode code point for a given character," she explained to Eman. "For ASCII characters, the Unicode code point is the same as the ASCII value."

ASCII to Binary Function

Eman nodded and started working on the next function:

def ascii_to_binary(ascii_values):
    """Convert a list of ASCII/Unicode values to binary strings."""
    binary_values = []
    for value in ascii_values:
        # Convert to binary and remove the '0b' prefix
        binary = bin(value)[2:]
        # Ensure each binary number is 8 bits long
        binary = binary.zfill(8)
        binary_values.append(binary)
    return binary_values

"I added the zfill() method to pad the binary numbers with leading zeros so they're all 8 bits long," Eman explained. "That makes it easier to convert back later."

Binary to ASCII Function

They continued working on the remaining functions:

def binary_to_ascii(binary_values):
    """Convert a list of binary strings to ASCII/Unicode values."""
    ascii_values = []
    for binary in binary_values:
        # Convert binary string to integer
        value = int(binary, 2)
        ascii_values.append(value)
    return ascii_values


def ascii_to_text(ascii_values):
    """Convert a list of ASCII/Unicode values to text."""
    text = ""
    for value in ascii_values:
        text += chr(value)
    return text

"The chr() function is the opposite of ord()," Calissa noted. "It converts an ASCII/Unicode value back to a character."

Creating a Simple Menu System

Finally, they created a simple menu system to tie everything together:

def main():
    print("Text <-> Binary Converter")
    print("========================")
    print("1. Convert text to binary")
    print("2. Convert binary to text")
    print("3. Exit")

    choice = input("Enter your choice (1-3): ")

    if choice == '1':
        text = input("Enter the text to convert: ")
        ascii_values = text_to_ascii(text)
        binary_values = ascii_to_binary(ascii_values)

        print("\nText: ", text)
        print("ASCII/Unicode values: ", ascii_values)
        print("Binary values: ")
        for i, binary in enumerate(binary_values):
            print(f"{text[i]} = {ascii_values[i]} = {binary}")

    elif choice == '2':
        binary_input = input("Enter binary values (8 bits each, separated by spaces): ")
        binary_values = binary_input.split()

        try:
            ascii_values = binary_to_ascii(binary_values)
            text = ascii_to_text(ascii_values)

            print("\nBinary values: ", binary_values)
            print("ASCII/Unicode values: ", ascii_values)
            print("Text: ", text)
        except ValueError:
            print("Error: Invalid binary input. Please use 8-bit binary numbers.")

    elif choice == '3':
        print("Goodbye!")
        return

    else:
        print("Invalid choice. Please try again.")

    # Recursive call to keep the program running
    print("\n")
    main()


# Start the program
if __name__ == "__main__":
    main()
Testing the Program

"Let's test it," Calissa said after they finished coding. She ran the program and selected option 1, then entered "Hello, World!".

The program displayed:

Text:  Hello, World!

ASCII/Unicode values:  [72, 101, 108, 108, 111, 44, 32, 87, 111, 114, 108, 100, 33]

Binary values:

H = 72 = 01001000
e = 101 = 01100101
l = 108 = 01101100
l = 108 = 01101100
o = 111 = 01101111
, = 44 = 00101100
  = 32 = 00100000
W = 87 = 01010111
o = 111 = 01101111
r = 114 = 01110010
l = 108 = 01101100
d = 100 = 01100100
! = 33 = 00100001

"It works!" Eman exclaimed. "Now let's try converting binary back to text."

They tested the second option by entering the binary values for "Hi" (01001000 01101001), and the program correctly converted it back to text.

Refining the Program

"This is really cool," Calissa said. "We're actually seeing the whole process—from text to ASCII to binary and back again."

"And we're using all the concepts we learned," Eman added. "Character encoding with ASCII and Unicode, binary representation, and programming fundamentals like functions and control structures."

They spent the next hour refining their program, adding error handling and improving the user interface. By the end of the session, they had a fully functional application that demonstrated the principles of how computers represent and process text data.

"I think we're ready to present this to the class," Calissa said, saving the final version of their code.

"Definitely," Eman agreed. "And the best part is that we really understand what's happening at every step—from the high-level Python code we wrote to the binary data that the computer actually processes."

Presenting the Project

On Monday, Calissa and Eman presented their project to the class. They explained how computers represent text using ASCII and Unicode, demonstrated their program's ability to convert between text and binary, and showed how Python's interpreter translates their high-level code into machine language instructions that the CPU can execute.

"What I find most fascinating," Calissa concluded, "is how we've built this bridge between human language and machine language. Our program takes text that's meaningful to humans and converts it to binary that's meaningful to computers, and then back again."

Ms. Asadi nodded approvingly. "That's exactly what programming is all about—creating that bridge between human intent and computer execution. And understanding how data is represented and processed is fundamental to being an effective programmer."

After class, as they were packing up their things, Eman turned to Calissa. "So, what's next? Any ideas for our next coding project?"

Calissa smiled. "Actually, I've been thinking about creating a simple image converter—to show how computers represent and manipulate images as binary data."

"That sounds challenging," Eman said. "But after what we've learned and what we've built, I think we're ready for it."

Conclusion

As they walked out of the classroom, Calissa reflected on how far she'd come in just a few weeks. What had started as a simple question about how computers understand text had led her on a journey through the foundations of computer science—from binary representation to character encoding to programming languages and execution. And with each step, the mysterious world of computing had become a little less mysterious and a lot more fascinating.

"You know," she said to Eman, "I used to think of computers as these magical black boxes. But now I see them as these incredibly logical systems built on simple principles."

"That's the beauty of computer science," Eman replied. "Complex systems built from simple foundations."

"And it all starts with binary," Calissa added. "The digital bridge between human and machine."