Technology Tap: CompTIA Study Guide

History Of Modern Technology: OLED and Beyond | Technology Education

Juan Rodriguez Season 5 Episode 106

professorjrod@gmail.com

Explore technology education and IT skills development through the history of modern tech, from CRTs to OLED displays.


A single glowing dot in a glass tube changed how we understand the world. We follow that spark from Carl Ferdinand Braun’s cathode-ray breakthrough to radar operators reading life-and-death blips, to living rooms lit by television and desktops shaped by GUI windows. Along the way, we show why screens didn’t just display information—they taught humans to think in frames, patterns, and pixels.

I walk through the interface pivots that mattered: when computers stopped spitting paper and started talking back visually; when text terminals gave way to Xerox PARC’s icons and pointers; when Apple and IBM normalized monitors as the heart of personal computing with standards like CGA, EGA, and VGA. Then we dive into the flat panel turn: the physics of liquid crystals, the jump from passive to active matrix TFT, and the moment LCDs escaped laptops to conquer the desk. We weigh plasma’s cinematic highs and practical lows, and how LED backlights, higher refresh rates, and HDR transformed clarity, contrast, and color.

From there, we explore OLED’s promise—self-emissive pixels with true blacks, flexible forms, and motion precision that redefined smartphones, TVs, and creative workflows. We compare Mini‑LED’s local dimming advances and MicroLED’s potential for brightness, longevity, and perfect blacks, while noting the manufacturing roadblocks. Finally, we look ahead: curved, foldable, and rollable designs that adapt to you; VR and AR that pull displays onto your eyes; and early steps toward holograms and light field systems that project depth without headsets. The through-line is simple and profound: as control over light improves, the screen fades and the experience takes its place.

If this journey reshaped how you see your monitor, share it with a friend, subscribe for more deep dives, and leave a review to help others discover Technology Tap.

Support the show


Art By Sarah/Desmond
Music by Joakim Karud
Little chacha Productions

Juan Rodriguez can be reached at
TikTok @ProfessorJrod
ProfessorJRod@gmail.com
@Prof_JRod
Instagram ProfessorJRod

SPEAKER_00:

And welcome to Technology Tap. I'm Professor JRod. In this episode, the history of display monitors. Let's tap in.

Back in the day... there's a device so taken for granted that most of us forget how revolutionary it really is. Not the keyboard, not the modem, but the window into the digital future: the display monitor. From the first glowing dots on scientific tubes, to massive wartime radar screens, to the soft colored rectangles on your desk and the crisp OLED panel in your pocket, the story of the display monitor is the story of how humans learned to see information. Because before computers could talk to us, they had to learn how to show us. Let's tap in. Let's begin.

Not in Silicon Valley or Tokyo, but in late 19th-century Europe, where scientists studied electrons in a glass tube. Long before the first computer, before anyone imagined a glowing rectangle displaying our thoughts, inventors played with charged particles inside vacuum tubes. In 1897, a German physicist named Carl Ferdinand Braun demonstrated something remarkable. Inside a sealed glass tube, he fired electrons at a fluorescent screen. Where they struck, a tiny point of light appeared. His invention became known as the cathode-ray tube, or CRT. It was little more than a flickering dot, but it was the first time electricity created an image on command. At that moment, the idea of the electronic display was born. There were no computers, no television, no graphics, but the mechanism had arrived. Every revolution begins humbly, sometimes with a single glowing dot.

For the next decade, that dot would evolve into lines, and then into pictures. The early 1900s brought refinement. By steering electron beams with magnetic fields and voltage control, scientists found they could move that glowing dot across the screen. Sweep it fast enough, and your eyes blurred the light into a line. Do that line after line, and suddenly an image could be drawn.
The fundamentals of screen technology, scan lines, refresh rates, bright phosphors glowing under electron bombardment, were being defined before the word television even existed. Names like Zworykin and Baird, often associated with early TV, built devices that created moving pictures with cathode rays. And so, decades before the personal computer, the CRT was already teaching humanity how to think in frames.

No technology escapes the pull of war. The CRT became essential during World War II, not for entertainment but for detection. Radar. Radar brought new urgency to visualizing invisible information. Incoming radio waves reflected from enemy aircraft were displayed as little blips of light on circular CRT screens. Air defense crews stared at glowing green dots to tell friend from foe. It was high-stakes visualization, the difference between defense and disaster. In dusty rooms under blackout curtains, operators learned a truth every computer scientist would later accept: human beings understand patterns by seeing them. Numbers alone were not enough. This simple truth, that vision drives comprehension, would define everything to follow. The first digital screens didn't show movies, they showed danger.

When peace returned, the CRT left the command center and entered the living room. Television sets of the 1940s and 50s brought moving images to millions. The black-and-white phosphor glow became a global shared experience. Though not used for computing yet, television trained the world to believe in screens. The CRT became familiar, trusted, even expected. Behind the glass, an electron gun scanned left to right, top to bottom, 30, 50, or 60 times per second. This repetition, too fast for the eye to isolate, formed the illusion of a continuous picture. A simple trick of physics and perception gave birth to modern display culture. And in doing so, it prepared society to accept computers when they arrived.
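The scanning arithmetic here is simple enough to sketch. Here's a minimal Python example, assuming NTSC-style numbers (525 scan lines, a full picture painted about 30 times per second) purely for illustration, showing how little time the beam gets per line:

```python
# Rough arithmetic behind CRT scanning, using NTSC-style numbers
# (525 lines, ~30 full pictures per second) as an illustrative assumption.

def line_time_us(lines_per_frame: int, refresh_hz: float) -> float:
    """Time the beam spends drawing one scan line, in microseconds."""
    frame_time_s = 1.0 / refresh_hz          # one full screen repaint
    return frame_time_s / lines_per_frame * 1e6

# A 525-line picture repainted 30 times per second:
print(round(line_time_us(525, 30), 1))  # -> 63.5 microseconds per line
```

Tens of microseconds per line, thousands of times a second: far faster than the eye can follow, which is exactly why the dot blurs into a picture.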
Because by then, people already knew how to look at a glowing screen. The first electronic computers of the 1940s and early 50s didn't have monitors. Output came through printouts, lamps, or punch cards. But as computers grew faster, humans needed something more responsive, a way to interact in real time. Enter the CRT. Early displays were little more than oscilloscopes, scopes that drew patterns, waves, and text. On machines like the Whirlwind at MIT, engineers adopted CRTs to show characters, allowing researchers to debug programs faster. Soon after, researchers paired CRT displays with keyboards, introducing direct interaction between human and machine. This was the beginning of the digital display, a monumental shift. The computer was no longer a silent box, but a partner that could talk back visually. The moment computers learned to display, humans began to collaborate with them. The screen became the medium of communication.

In the 60s and 70s, computer displays showed mostly text. Green phosphor characters on black, 24 rows, 80 columns, just enough to hold a page. It was minimal but transformative. Systems like DEC's VT100 terminal became iconic, the interface of early programming, networking, and digital business. Engineers and students around the world learned to speak to machines through text displays. CRT terminals became the voice of computing, quiet but authoritative. And though primitive, they democratized access to computational thinking. No more feeding stacks of punch cards, just type and see. Instant feedback, a radical concept.

Graphics arrive. Still, humans craved images. The 1970s gave us the first computer graphics displays. At Xerox PARC, that's the Palo Alto Research Center, researchers built workstations that displayed windows, icons, and pointers. Not just text, but a visual language. It was the dawn of the GUI, the graphical user interface.
The CRT allowed designers to present information visually, menus instead of columns, icons instead of memorized syntax. This was a philosophical leap. Computing adapted to human perception, not the other way around. The screen was no longer a terminal, it was a canvas. The display didn't just show information, it shaped how we understood it. And soon the revolution would reach the masses.

Apple, IBM, and the desktop age. The late 70s and 80s brought computers into homes and offices. With that migration came the monitor. Apple's early machines, the Apple II and Macintosh, used CRTs to show crisp, friendly interfaces. IBM's PC standardized the idea of a monitor separate from the computer, a modular design. Graphics standards emerged, CGA, EGA, VGA, each improving color depth and resolution. From blocky pixels to smoother text, from stone-carved glyphs to artistic icons, the screen became a foundation of meaning. Soon we weren't just reading on CRTs, we were designing, illustrating, editing video, and playing immersive games. The monitor had become a portal. Right, and VGA, those were good times. It was an interesting time to be around.

The age of the flat panel: LCD, plasma, and the LED revolution. When the 80s ended, the world's desks were dominated by big, boxy monitors, the glorious glass faces of CRT technology. But behind the scenes, engineers were already dreaming of something slimmer, lighter, and more efficient. A display that could liberate computers from the bulk of their tubes. This is the story of how we went flat. Every generation tries to make the screen thinner, but what we really want is to make it closer to ourselves.

The limits of glass. By the early 1990s, CRT monitors had reached their limit. They were heavy, deep, and energy hungry. A 21-inch screen weighed over 50 pounds. Yet the world was demanding more pixels, more colors, more portability.
Laptops were emerging, and a glass tube simply wouldn't fit. The next revolution had to be something different: a display made not by bombarding phosphor with electrons, but by controlling light itself.

The liquid crystal breakthrough. Liquid crystal sounds like science fiction, a substance that's both fluid and ordered. But the idea goes back to 1888, when an Austrian botanist named Friedrich Reinitzer noticed that certain organic molecules became milky before melting. They flowed like a liquid, yet had structure like a solid. Decades later, scientists discovered these materials could rotate polarized light when an electric field was applied. By the 1960s, researchers at RCA and Sharp realized they could use this property to create a display without moving parts. It was the birth of the LCD, the liquid crystal display. Sometimes progress comes not from building machines stronger, but from making matter more delicate.

From calculators to computers. The first LCDs appeared in the 1970s in wristwatches and calculators. They were simple seven-segment displays, low-powered, easy to read in sunlight. Then came the first portable computers. Sharp and Toshiba pioneered laptop designs using passive matrix LCDs, slow to refresh and often blurry, but revolutionary in size. By the early 1990s, IBM's ThinkPad and Apple's PowerBook proved that computing could go mobile, thanks to the LCD. Still, passive matrix screens had a problem. They were sluggish, ghosting every movement. The solution came in the form of TFT, thin-film transistors, giving each pixel its own electronic switch. This active matrix LCD brought clarity and speed, a display worthy of the new digital age.

The flat panel revolution. By the mid-1990s, the LCD was ready to escape the laptop and conquer the desktop. Companies like Samsung, NEC, and Sony introduced standalone flat panel monitors. At first, they were a luxury. A 15-inch LCD cost over a thousand dollars. I remember those times. It was expensive.
But they offered what no CRT could: a thin profile, no flicker, and low energy consumption. By 2003, the tide had turned. LCD sales surpassed CRTs for the first time. The era of the flat panel had arrived. The world wanted to see clearly and to see lightly.

The plasma promise. While LCDs took over computers, televisions entered their own revolution. Plasma displays, developed by Donald Bitzer and Gene Slottow at the University of Illinois in the 1960s, used tiny cells of ionized gas to emit light, each cell a miniature neon lamp. For decades they remained a laboratory curiosity until the 1990s, when companies like Pioneer and Panasonic turned them into large televisions. Plasma screens could reach sizes no LCD could match: 40, 50, 60 inches, with rich contrast and deep color. But they were expensive and power hungry. They glowed bright and faded fast. Every bright idea has a half-life. By the 2010s, plasma had vanished, a transitional glory in the quest for perfect light.

Yeah, the one thing about plasma is that if you didn't mount it carefully and it was tilted to one side, the color would shift all the way to that side. And if you laid it down flat, forget it. That was the end. And I remember towards the end, the plasmas were selling cheap. All the TV stores, Best Buy, Circuit City, they all wanted to get rid of the plasmas, and they were selling them at a deep, deep discount. I remember my dad bought one. He's like, yeah, I got a big plasma TV for like 200 bucks. And I'm like, Dad, they're getting rid of plasma, right? Plasmas are really no good. But he'd already bought it, so, all right.

The LED renaissance. As LCDs grew dominant, engineers sought to improve their backlight. Early LCDs relied on CCFLs, cold cathode fluorescent lamps, tiny tubes that provided white light but added bulk.
The solution was to replace those tubes with LEDs. Smaller, cooler, and more efficient, LEDs allowed displays to become thinner and brighter. By the late 2000s, LED TVs and monitors dominated store shelves. Though technically still LCDs, their backlights made all the difference. Edge-lit LEDs made screens slim as picture frames, while full-array LEDs added local dimming for deep contrast. Backlight became a metaphor, illumination from behind, like insight itself.

Pixels, power, and precision. By the 2010s, resolution became the new arms race. HD, Full HD, 4K, 8K, numbers that once measured scientific instruments now defined living room bragging rights. Refresh rates soared from 60 Hz to 120 to 240. Color depth expanded to 10-bit and HDR. Gamers, designers, and cinematographers pushed for perfection in every pixel. And the LED panel, once a novelty, had become the canvas of the century. We no longer look at screens, we look through them.

We've followed the evolution of display technology from glowing vacuum tubes to portable LCDs to the LED-lit rectangles that power desktops and living rooms worldwide. But the story doesn't end there, it accelerates. Because the display has moved beyond showing images. It shapes our reality, it surrounds us. It bends, curves, folds, and sometimes disappears into thin air. Welcome to the modern age of displays, where pixels are no longer fixed squares of light, but living, organic, adaptive elements that redefine how we see the digital age.

The rise of OLED: when pixels became organic. In the early 2000s, a new idea emerged. What if the pixels themselves emitted light instead of requiring a backlight? Enter OLED, organic light-emitting diode technology. These pixels are made of carbon-based molecules that glow when electricity is applied. No backlight, no filters, just pure direct emission. The benefits were startling: infinite contrast, true blacks because black pixels simply turn off, more vibrant colors, ultra-thin displays, flexible panels.
The first OLED displays appeared on phones and MP3 players. Then Sony unveiled the XEL-1 in 2007, a television only 3 millimeters thick. It shocked the world, not just for its clarity, but for its promise. OLED didn't just improve displays, it liberated them. Today OLED dominates premium smartphones, TVs, VR headsets, and creative studios. Millions of people look at OLED screens every day without knowing it.

As mobile devices became the center of daily life, companies searched for ways to enhance OLED. Samsung pioneered AMOLED, active-matrix OLED, bringing faster response times and deeper saturation. Apple refined it with Super Retina and XDR displays, elevating brightness, accuracy, and contrast. Google, Huawei, and countless others followed. OLED became more than a technology. It became the standard for high-end displays. The smartphone era didn't just shrink the computer. It miniaturized the display and put billions of perfect pixels in our pockets. For the first time, humanity carried a personal window to the digital universe.

HDR: the battle for brightness. Resolution is easy to measure, but perception, how we experience images, depends on contrast and brightness. Enter HDR, high dynamic range. HDR brings brighter highlights, deeper shadows, expanded color, detail in the extremes of light. Developed by Dolby, Samsung, and other pioneers, HDR transforms everything from movies to gaming. Whether Dolby Vision, HDR10, or HLG, HDR added emotion to pixels. It made sunsets richer, city lights sharper, and dark scenes more dramatic. HDR restored the drama of reality, pixel by pixel.

Mini-LED and MicroLED: precision light. LCD wasn't done yet. It reinvented itself through new backlights. Mini-LED packs thousands of small LEDs into the backlight, creating many local dimming zones and improving contrast, brightness, and HDR performance. MicroLED, the future successor, gives each pixel its own microscopic LED. No organic material, no burn-in, ultra-long lifespan. MicroLED combines the best of OLED and LED.
Perfect blacks, extreme brightness, incredible durability. But manufacturing remains complex and expensive. It's coming, but slowly. Innovation often begins big, then shrinks until it disappears into invisibility.

Curved, foldable, rollable: screens unbound. Thanks to OLED and flexible substrates, screens no longer have to be rigid. We now have foldable phones, rollable TVs, curved gaming monitors, wraparound digital signage. The dream of the 60s, a screen that unrolls like paper, is no longer fiction. All this flexing is reshaping design. Phones that serve as tablets, monitors that envelop the viewer, televisions that disappear when not in use. The screen no longer sits in front of us, it adapts to us.

VR and AR: displays on your eyes. The next frontier isn't on a desk or a wall. It's on your face. Virtual reality and augmented reality require displays with extremely high pixel densities, fast refresh rates, low persistence, and optical precision. Companies like Oculus, Meta, Sony, Valve, and Apple have pushed display technology into the realm of optics. Retina-level VR requires 60 to 70 pixels per degree, a density far beyond traditional screens. AR goes further, blending the digital and physical worlds: holographic projections, transparent displays, and wearable interfaces. The thing about VR is that it can make you dizzy after a while. I think they're getting better and better, but can they sell it to humans? Are humans going to adapt to it? How do we adapt to these VR and AR systems without getting nauseous or sick? The monitor is no longer a device, it's an environment.

Holographic and light field displays. Beyond VR and AR lie the experimental displays of tomorrow. Holographic displays: true 3D images projected into space. Companies like Looking Glass and Leia lead early development. Light field displays: displays that simulate how light travels, allowing depth without glasses.
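That retina-level figure for VR is easy to reason about: pixels per degree is roughly the panel's horizontal resolution divided by the headset's horizontal field of view. A tiny sketch with hypothetical headset numbers (the 2000-pixel, 100-degree values are made up for illustration):

```python
# Pixels-per-degree: the density figure behind "retina-level VR".
# Approximation: horizontal pixels divided by horizontal field of view.

def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Approximate angular pixel density of a headset display."""
    return horizontal_pixels / fov_degrees

# A hypothetical headset with 2000 horizontal pixels per eye spread
# over a 100-degree field of view:
print(round(pixels_per_degree(2000, 100), 1))  # -> 20.0
```

Twenty pixels per degree versus the 60 to 70 needed for retina-level sharpness shows why VR panels still have a long way to go: wide fields of view spread the same pixels thin.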
Retinal projection: devices that paint images directly onto the retina using safe, low-energy lasers. The monitor is dissolving, becoming something more natural, more immersive, more integrated into human perception. As the screen disappears, the experience becomes the display.

The universal screen. We now live in a world where displays are everywhere. Phones, tablets, desktops, billboards, smartwatches, vehicles, appliances, eyewear, storefront windows. Any surface can become a screen. Any moment can be illuminated with information. What began as a flickering dot in a glass tube has become a global canvas of pixels.

Final reflection. The monitor is more than a device. It is a testament to human ingenuity, our desire to see, to visualize, to project information into the world. From cathode rays to liquid crystals to organic light to holographic fields, every generation demands a clearer picture. And every display is a step toward making information as real and as meaningful as the world around us. Screens show data, but more importantly, they show us ourselves.

Thank you for joining me on the history of display technology. I'm Professor JRod. And until next time, stay grounded, stay curious, and keep tapping into technology. We are now part of the PodMatch network. You can follow me on TikTok at ProfessorJRod, or you can email me at professorjrod@gmail.com.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

Crime Junkie

Audiochuck