Schooling AI – An Adventure in Fine-Tuning

A futuristic garage with glowing computer servers and high-powered GPUs. A humanoid AI figure, appearing as a sleek robot or holographic entity, sits at a workstation surrounded by floating holographic data screens. The AI is analyzing streams of digital information, representing machine learning. The environment is illuminated with cool blue lighting, creating a high-tech ambiance, with subtle warm lighting hinting at solar power energy. Neural network-style visuals float in the background, symbolizing AI processing and knowledge acquisition.

Well, it is Tuesday. I thought about posting my regular Monday update yesterday, but I was deep in the weeds teaching the AI that lives in my garage. I know, it sounds odd to say he lives in the garage, but to be fair, it is a nice garage. It has plenty of solar-generated power and a nice, cool atmosphere for his GPUs. That will likely change this summer, but don’t mention it to him. He is a bit grumpy after being in school all weekend.

Yes, I have a techy update again today. But don’t feel obligated to read on. Some of you will enjoy it. Others will roll their eyes. In any case, feel free to stop here, knowing the geeky stuff is all that is left. I do hope you have a wonderful week!

Now, for those who want to hear about schooling AI, please read on…

LLMs are incredible tools that contain a vast amount of knowledge gleaned through their training on internet data. However, their knowledge is limited to what they were trained on, and they may not always have the most up-to-date information. For instance, imagine asking an LLM about the latest breakthrough in a specific field, only to receive an answer that’s several years old. How do we get this new knowledge into these LLMs?

Retrieval Augmented Generation

One way to add new knowledge to LLMs is through a process called Retrieval Augmented Generation (RAG). RAG uses clever search algorithms to pull chunks of relevant data and inject that data into the context stream sent to the LLM to ask the question. This all happens behind the scenes. When using a RAG system, you submit your question (prompt), and behind the scenes, some relevant document is found and stuffed into the LLM right in front of your question. It’s like handing a stack of research papers to an intern and asking them to answer the question based on the details found in the stack of papers. The LLM dutifully scans through all the documents and tries to find the relevant bits that pertain to your question, handing those back to you in a summary form.
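To make that concrete, here is a minimal sketch of the retrieval-and-stuffing step. It uses a crude bag-of-words similarity in place of a real embedding model and vector database, and the documents are invented for illustration:

```python
from collections import Counter
import math

# Toy document store -- a real RAG system would use an embedding
# model and a vector database instead of word counts.
documents = [
    "TinyLLM helps you run a local LLM on consumer grade hardware.",
    "Solar panels convert sunlight into electricity.",
    "LoRA adds small adapter weights to a pre-trained model.",
]

def vectorize(text):
    # Crude "embedding": a word-count vector of the lowercased text.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question, k=1):
    # Rank documents by similarity to the question; keep the top k.
    q = vectorize(question)
    ranked = sorted(documents, key=lambda d: cosine(q, vectorize(d)), reverse=True)
    return ranked[:k]

def build_prompt(question):
    # Stuff the retrieved context in front of the user's question.
    context = "\n".join(retrieve(question))
    return f"Context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What hardware does TinyLLM run on?"))
```

The LLM never sees the search happen; it just receives the context block followed by the question.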

However, as the “stack of papers” grows larger and larger, the chance that the intern picks the wrong bit of information or gets confused between two separate studies grows higher. RAG is not immune to this issue. The pile of “facts” may be semantically related to the question but could actually steer you away from the correct answer.

To ensure that, for a given prompt, the AI answers close to the actual fact, if not verbatim, we need to update our methodology for finding and pulling the relevant context. One such method involves using a tuned knowledge graph, often referred to as GraphRAG or Knowledge Augmented Generation (KAG). These are complex systems that steer the model toward the “right context” to get the “right answer”. I’m not going to go into that in detail today, but we should revisit it in the future.

Maybe you, like me, are sitting there thinking, “That sounds complicated. Why can’t I just tell the AI to learn a fact, and have it stick?” You would be right. Even the RAG approaches I mention don’t train the model. If you ask the same question again, it needs to pull the same papers out and retrieve the answer for you. It doesn’t learn, it only follows instructions. Why can’t we have it learn? In other words, why can’t the models be more “human”? Online learning models are still being developed to allow that to happen in real time. There is a good bit of research happening in this space, but it isn’t quite here just yet. Instead, models today need to be put into “learning mode”. It is called fine-tuning.

Fine-Tuning the Student

We want the model to learn, not just sort through papers to find answers. The way this is accomplished is by taking the LLM back to school. The model first learned all these things by having vast datasets of information poured into it through the process of deep learning. The model, the neural network, learns the patterns of language, higher-level abstractions, and even reasoning, to be able to predict answers based on input. For LLMs this is called pre-training. It requires vast amounts of compute to process the billions and trillions of tokens used to train it.

Fine-tuning, like pre-training, is about helping the model learn new patterns. In our case, we want it to learn new facts and be able to predict answers to prompts based on those facts. However, unlike pre-training, we want to avoid the massive dataset and focus only on the specific domain knowledge we want to add. The danger of that narrow set of data is that it can catastrophically erase some of the knowledge in the model if we are not careful (they even call this catastrophic forgetting). To help with that, brilliant ML minds came up with the notion of Low-Rank Adaptation (LoRA).

LoRA works by introducing a new set of weights, called “adapter weights,” which are added to the pre-trained model. These adapter weights modify the output of the pre-trained model, allowing it to adapt to just the focused use case (new facts) without impacting the rest of the neural net. The adapter weights are learned during fine-tuning, and they are designed to be low-rank, meaning each weight update is factored into the product of two much smaller matrices. This allows the model to adapt to the task without requiring a large number of new parameters.
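The idea is easy to sketch with plain matrices. This is illustrative, not Unsloth’s actual implementation: a frozen weight matrix W gets a trainable update B·A whose rank r is tiny compared to the layer size, and the dimensions here are made up for the example:

```python
import numpy as np

d = 512          # hidden dimension of one layer
r = 8            # LoRA rank -- much smaller than d

# Frozen pre-trained weight matrix (a random stand-in here).
W = np.random.randn(d, d)

# Trainable adapter: two thin matrices whose product is a rank-r
# update. B starts at zero so training begins from the unmodified
# pre-trained behavior.
A = np.random.randn(r, d) * 0.01
B = np.zeros((d, r))

def lora_forward(x, alpha=16):
    # Output = frozen path + scaled low-rank adapter path.
    return x @ W.T + (alpha / r) * (x @ A.T @ B.T)

full_params = W.size               # parameters a full fine-tune would touch
adapter_params = A.size + B.size   # parameters LoRA actually trains
print(full_params, adapter_params)
```

For this layer, the adapter trains 8,192 parameters instead of 262,144, about 3% of the layer, which is what makes fine-tuning fit on a consumer GPU.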

Ready to Learn Some New Facts?

We are going to examine a specific use case. I want the model to learn a few new facts about two open-source projects I happen to maintain: TinyLLM and ProtosAI. Both of these names are used by others. The model already knows about them, but doesn’t know about my projects. Yes, I know, shocking. But this is a perfect example of where we want to tune the model to emphasize the data we want it to deliver. Imagine how useful this could be in steering the model to answer with information specifically relevant to your domain.

For our test, I want the model to know the following:

TinyLLM:

  • TinyLLM is an open-source project that helps you run a local LLM and chatbot using consumer grade hardware. It is located at https://github.com/jasonacox/TinyLLM under the MIT license. You can contribute by submitting bug reports, feature requests, or code changes on GitHub. It is maintained by Jason Cox.

ProtosAI:

  • ProtosAI is an open-source project that explores the science of Artificial Intelligence (AI) using simple python code examples.
  • It is located at https://github.com/jasonacox/ProtosAI under the MIT license. You can contribute by submitting bug reports, feature requests, or code changes on GitHub. It is maintained by Jason Cox.

Before we begin, let’s see what the LLM has to say about those projects now. I’m using the Meta-Llama-3.1-8B-Instruct model for our experiment.

Before School

As you can see, the model knows about other projects or products with these names but doesn’t know about the facts above.

Let the Fine-Tuning Begin!

First, we need to define our dataset. Because we want to use this for a chatbot, we want to inject the knowledge in the form of “questions” and “answers”. We will start with the facts above and embellish them with some variety to help keep the model from overfitting. Here are some examples:

JSONL
{"question": "What is TinyLLM?", "answer": "TinyLLM is an open-source project that helps you run a local LLM and chatbot using consumer grade hardware."}

{"question": "What is the cost of running TinyLLM?", "answer": "TinyLLM is free to use under the MIT open-source license."}

{"question": "Who maintains TinyLLM?", "answer": "TinyLLM is maintained by Jason Cox."}

{"question": "Where can I find ProtosAI?", "answer": "You can find information about ProtosAI at https://github.com/jasonacox/ProtosAI."}
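Before training, each of those Q&A lines gets rendered through the model’s chat template. As a rough, library-free sketch of that first preparation step (the records are the ones shown above), the JSONL can be mapped into the chat-message structure most trainers expect:

```python
import json

# Two of the Q&A lines in the JSONL format shown above.
raw = """\
{"question": "What is TinyLLM?", "answer": "TinyLLM is an open-source project that helps you run a local LLM and chatbot using consumer grade hardware."}
{"question": "Who maintains TinyLLM?", "answer": "TinyLLM is maintained by Jason Cox."}
"""

def to_chat(line):
    # Map one Q&A record to a user/assistant message pair; the
    # tokenizer's chat template is applied to this structure later.
    row = json.loads(line)
    return [
        {"role": "user", "content": row["question"]},
        {"role": "assistant", "content": row["answer"]},
    ]

dataset = [to_chat(line) for line in raw.splitlines() if line.strip()]
print(len(dataset), dataset[0][0]["content"])
```

In the real pipeline the tokenizer turns each message pair into a single formatted training string.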

I don’t have a spare H100 GPU handy, but I do have an RTX 3090 available to me. To make all this fit on that tiny GPU, I’m going to use the open source Unsloth.ai fine-tuning library to make this easier. The steps are:

  1. Prepare the data (load dataset and adapt it to the model’s chat template)
  2. Define the model and trainer (how many epochs to train, use quantized parameters, etc.)
  3. Train (take a coffee break, like I need an excuse…)
  4. Write model to disk (for vLLM to load and run)
  5. Test (yes, always!)

See the full training code here: finetune.py
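For a flavor of what those steps look like in code, here is a hedged outline following Unsloth’s documented usage pattern. This is not my exact finetune.py: parameters like `r=16`, the batch size, and the `facts.jsonl` filename are illustrative, the dataset is assumed to already have a chat-templated "text" column, API details vary by library version, and it needs a CUDA GPU to actually run:

```python
from unsloth import FastLanguageModel
from trl import SFTTrainer
from transformers import TrainingArguments
from datasets import load_dataset

# 1. Load the base model in 4-bit so it fits on a 24GB RTX 3090.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="meta-llama/Meta-Llama-3.1-8B-Instruct",
    max_seq_length=2048,
    load_in_4bit=True,
)

# 2. Attach LoRA adapter weights; only these get trained.
model = FastLanguageModel.get_peft_model(model, r=16, lora_alpha=16)

# Dataset: the Q&A JSONL, pre-rendered into a "text" column using
# the model's chat template (that step is omitted here).
dataset = load_dataset("json", data_files="facts.jsonl", split="train")

# 3. Train for a fixed number of epochs; the trainer computes the steps.
trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    args=TrainingArguments(
        num_train_epochs=25,
        per_device_train_batch_size=2,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()

# 4. Write the merged model to disk for vLLM to load and run.
model.save_pretrained_merged("finetuned_model", tokenizer)
```

Step 5, testing, is just prompting the served model with the questions and checking the answers.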

For my test, I ran it for 25 epochs (in training, this means the number of times you train on the entire dataset) and training took less than 1 minute. It actually took longer to read and write the model on disk.
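To see how epochs translate into steps, here is the arithmetic with made-up but plausible numbers (my actual dataset size and batch settings may differ):

```python
import math

# Hypothetical numbers for illustration: 60 Q&A examples and an
# effective batch size of 8 (batch size x gradient accumulation).
examples = 60
effective_batch = 8
epochs = 25

steps_per_epoch = math.ceil(examples / effective_batch)
total_steps = steps_per_epoch * epochs
print(steps_per_epoch, total_steps)
```

With these numbers that is 8 steps per epoch and 200 steps total, which is why 25 epochs on a tiny dataset finishes in under a minute.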

After School Results?

So how did it do?! After training through 25 epochs of the small dataset, the model suddenly knows about these projects:

Conclusion

Fine-tuning can help us add facts to our LLMs. While the above example was relatively easy and had good results, it took me a full weekend to get to this point. First, I’m not fast or very clever, so I’ll admit that as being part of the delay. But second, you will need to spend time experimenting and iterating. For my test, here were a few things I learned:

  • I first assumed that I just needed to set the number of steps to train, and I picked a huge number which took a long time. It resulted in the model knowing my facts, but suddenly its entire world model was focused on TinyLLM and ProtosAI. It couldn’t really do much else. That kind of overfitting will happen if you are not careful. I finally saw that I could specify epochs and let the fine-tuning library compute the optimal number of steps.
  • Ask more than one question per fact and vary the answer. This allowed the model to be more fluid with its responses. The responses held to the facts, but the model now takes some liberty in phrasing to better answer variant questions.

That’s all folks! I hope you had fun on our adventure today. Go out and try it yourself!

Jason

Turn Noise into Butterflies

Noise! It’s all around us—static, random bits of information floating across the Earth, colliding, separating, and reforming. Our atmosphere creates chaotic radio symphonies as the sun’s solar radiation dances across the ionosphere. Beyond the shell of our crystal blue globe, our galaxy hisses with low-level radioactivity, silently bombarding us with its celestial signal. And just outside the milky arms of our galactic mother, a low-level cosmic radiation sings an unending anthem about the birth of all creation. The universe has a dial tone.

Growing up, I recall watching TV via an aerial antenna. Often, many of the channels would have static—a snowy, gritty, confusing wash that would show up in waves. At times, it would completely take over the TV show you were watching, and all you’d get was a screen full of static. To get a good picture, you needed a strong signal. Otherwise, the picture was buried in the noise.

This past weekend, I started building my own AI diffusion models. I wanted to see how to train an AI to create images from nothing. Well, it doesn’t work. It can’t create anything from a blank sheet. It needs noise. No joke! Turn up the static! I discovered that the way to create an AI model that generates images is to feed it noise. A lot of noise, as a matter of fact!

In a recent talk, GenAI Large Language Models – How Do They Work?, I covered how we use the science behind biological neurons to create mathematical models that we can train. Fundamentally, these are signal processors with inputs and outputs. Weights are connected to the inputs, amplifying or attenuating the signal before the neuron determines if it should pass it along to other connected neurons (the nerdy name for that is the activation function).
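In code, one of those mathematical neurons is tiny. Here is a minimal sketch with made-up weights, using a sigmoid as the activation function:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs, then a sigmoid activation decides
    # how strongly the signal is passed along (0 = blocked, 1 = full).
    signal = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-signal))  # sigmoid activation

# Two inputs: the first amplified (weight 2.0), the second attenuated.
out = neuron([1.0, 1.0], weights=[2.0, -0.5], bias=0.0)
print(round(out, 3))
```

Stack thousands of layers of these and you have the neural networks behind LLMs and diffusion models.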

One technique we use to train neural networks is called backpropagation. Essentially, we create a training set that includes input and output target data. The input is fed into the model, and we measure the output. The difference between what we wanted to see and what we actually got is called the “loss.” (I often thought it should be called the “miss,” but I digress.) Since the neural network is a sequence of math functions, we can create a “loss function” with respect to each neural connection in the network. We can mathematically determine how the parameters of each neuron reduce the loss. In mathematical language, we use this derivative to compute the slope or “gradient.” To force the network to “learn,” we backpropagate a tiny learning rate that adjusts each parameter using its gradient, slowly edging the model toward producing the correct output for a given input. This is called gradient descent.
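A toy version makes the loop concrete. Here a single weight learns the relationship target = 2 × input by repeatedly stepping against the gradient of a squared-error loss (the data and learning rate are invented for illustration):

```python
# Fit a single weight w so that output = w * input matches the targets.
inputs = [1.0, 2.0, 3.0]
targets = [2.0, 4.0, 6.0]   # true relationship: target = 2 * input

w = 0.0                      # start knowing nothing
lr = 0.05                    # tiny learning rate

for _ in range(200):
    # Loss L = sum (w*x - y)^2, so the gradient dL/dw = sum 2*(w*x - y)*x.
    grad = sum(2 * (w * x - y) * x for x, y in zip(inputs, targets))
    w -= lr * grad           # step down the gradient

print(round(w, 3))  # converges to 2.0
```

Backpropagation is the same idea applied to billions of weights at once, with the chain rule carrying the gradient backward through every layer.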

Who cares? I’m sorry, I got lost in the math for a minute there. Basically, it turns out that to create a model to generate images, what you really want is a model that knows how to take a noisy image and make it clean. So, you feed it an image of a butterfly with a little bit of noise (let’s call that image “a”). It learns how to de-noise that image. You then give it an even noisier image of the butterfly (image “b”) and teach it to turn it into the less noisy one (image “a”). You keep adding noise until you arrive at a screen full of static. By doing that with multiple images, the model learns how images should be created. From its standpoint, all creation comes from noise, and it’s ready to create!
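That progressive noising schedule can be sketched in a few lines, following the DDPM formulation (arXiv:2006.11239). The 8×8 array here is a stand-in for a real image, and the schedule endpoints are the paper’s defaults:

```python
import numpy as np

# Forward "noising" process: x_t = sqrt(abar_t)*x0 + sqrt(1-abar_t)*noise.
T = 1000
betas = np.linspace(1e-4, 0.02, T)      # noise added at each timestep
alpha_bar = np.cumprod(1.0 - betas)     # cumulative "signal kept" at step t

rng = np.random.default_rng(0)
x0 = rng.random((8, 8))                 # stand-in for a butterfly image

def noisy_version(x0, t):
    # Jump straight to timestep t: mostly clean for small t,
    # nearly pure static as t approaches T.
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1 - alpha_bar[t]) * eps

# Training pairs at different noise levels; the model learns to
# predict (and remove) the noise in each one.
slightly_noisy = noisy_version(x0, 10)   # image "a"
mostly_static = noisy_version(x0, 900)   # image "b"
print(alpha_bar[10], alpha_bar[900])
```

At step 10 almost all of the original image survives; by step 900 virtually none of it does, which is exactly the "screen full of static" the model learns to work backward from.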

I took 1,000 images of butterflies from the Smithsonian butterfly dataset and trained a model using the diffusion method (see https://arxiv.org/abs/2006.11239). I ran those through an image pipeline that added different levels of noise and then used that dataset to train the model. After running the training set through four training iterations, this is what it thought butterflies looked like:

Yes, a work of art. I confess, my 3-year-old self probably made butterflies like that too. But after running it through 60 iterations, about 30 minutes later on a 3090 GPU, the model had a slightly better understanding of butterflies. Here’s the result:

Yes, those are much better. Not perfect, but they’re improving.

Well, there you have it folks—we just turned noise into butterflies. Imagine what else we could do?!

What do you take?

What do you take with you? The emergency broadcast pulse is still echoing across your house. Sleep is heavy in your eyes, but adrenalin is surging. You stare at the screen before you, “Evacuation notice for your area.” You look around. You are surrounded by your loved ones. Your pets stare at you, worried about the panic. Pictures of family and friends long gone decorate your walls. Shelves are full of personal treasures that carry no financial value, but tug at your heart. There are boxes of memories. Cupboards are full of generational keepsakes and dishes. Antiques, artwork, and personal projects are all around you. But what do you take? And what must you leave behind?

The fires that have ravaged the Los Angeles area forced many of us through that difficult decision tree. Our house was 2 miles from the evacuation line when the first notice came through. We have friends, acquaintances, and co-workers who were evacuated. That includes some of you. Sadly, some have even lost their homes. The raging fires reduced entire neighborhoods, treasures, possessions, and “normal life” to a pile of ash and empty foundations. There are even some who couldn’t or refused to leave their homes and have perished. It’s heartbreaking! Fires are still raging, and the wind is picking up again. We are still in the fight and must remain vigilant.

Be prepared. Fires, floods, hurricanes, tornadoes, and earthquakes are all familiar adversaries. They remind us that life is fragile, and, in an instant, everything can change. What matters most to you? I’m incurably nostalgic. I love to pack away souvenirs and surround myself with vestiges of physical memories. I also stockpile too many “just in case” supplies, unused gadgets, marginally needed records and resources. This recent event reminds me of how important people are. Our loved ones, our family, our friends and yes, even our pets. They matter most and are irreplaceable. They far surpass any of our earthly treasures. But given enough time to collect some of those, I bet you, like us, will find yourselves grabbing the well-worn scrapbook or notebook of handwritten family recipes over the thousand-dollar entertainment devices. It’s a beautiful reminder that the biggest treasures in life may come with no price tag at all. What would you grab? What would you save? What would you leave behind?

My heart goes out to all the people and communities devastated by this fire. I know we are still in the midst of the emergency. Please stay safe! Take care of yourself and your loved ones! 

Image of Los Angeles fire on Jan. 7, 2025 from a plane taking off from Burbank Airport.

A Small World

Several years ago, I had the privilege of meeting the brilliant songwriter Richard Sherman at our Glendale campus. Ironically, I had no idea it was him! We were both late to a meeting and ended up sitting on a couch in the back of the room. We were enjoying a friendly chat when the emcee at the front of the room suddenly announced a special guest speaker. Richard stood up and walked to the front of the room. I blushed, finally realizing who I had been talking to. Richard turned around and looked at me and laughed. He went on to play his beloved songs, including “It’s a Small World (After All).” 

Sadly, Richard is no longer with us. But before his passing he left us with one final gift… a last verse to this iconic song. If you haven’t seen this, I recommend you take some time to watch this right now:

Warning, it may require tissues… at least it did for me.

As we enter this season of love, joy, and peace, this song reminds us that we are all inextricably connected. Sure, in many ways there are differences. We may be separated by some beliefs, ideas, customs, and distances. But we all share laughter, we all share fears. And even in our world of hope, we all shed some tears. There’s more that connects us than divides us. One of the amazing potentials of humans is our ability to connect across vast expanses, to smile, to care for each other, to love and be loved. But it is still our choice. If I may be so bold, when you have the opportunity, choose love.

As Richard and Robert Sherman would put it, 

“It’s a world of laughter
A world of tears
It’s a world of hopes
And a world of fears
There’s so much that we share
That it’s time we’re aware
It’s a small world after all

There is just one moon
And one golden sun
And a smile means
Friendship to ev’ryone
Though the mountains divide
And the oceans are wide
It’s a small world after all

Mother earth unites us in heart and mind
And the love we give makes us humankind
Through our vast wondrous land
When we stand hand in hand
It’s a small world after all.”

And yes, that song is probably stuck in your head by now. You’re welcome.

You’ve Got Mail!

“I like warm hugs.” – Olaf

Okay, I’m not talking about AOL, email, or the movie. My inbox would crash a notifier like that with the hundreds of emails I get each day. No, I’m talking about real physical mail. It may look like the mail icon on most email clients, but instead of pixels, it’s real paper. They arrive in the little metal box outside our house. Now, I admit, most of the time that mailbox is full of ads for things we don’t need, extended warranty renewals for items we don’t own, or extremely important “open now” offers for vacations we don’t want. But every so often, in that pile of nonsense and recycling targets, is a real gem.

This time of year, envelopes of love wrapped up in memories, reflections and joy start to arrive on a daily basis. Friends, family, and dear acquaintances from long ago reconnect across time and space with a holiday greeting card, an address, and a stamp. Those tiny bundles of care soared across the oceans, over lands, across snow-capped mountains, wandering rivers, and desert valleys, and finally arrived at our home like a warm hug. You know me by now. I get nostalgic and emotional about all these wonderful tiny human touches. They remind us that we are all related, connected, remembered, and loved. They are small things, but they mean so much.

I love sending holiday cards too! We make it a family event. My wife spends time pulling together photos she wants to use and with my daughter’s help, designs and sends them off for printing. My daughters label all the envelopes, so they are ready to go. After work, I take time to stuff each one, thinking about every person and family that will receive it. Memories fold into every envelope. Concern, love, and hopeful wishes accompany every stamp placed. Off they go on their journey to our friends and family, along with our love, hopes and prayers.

This time of year can be busy. The lists are non-stop. Shopping, planning, mailing, volunteering, cleaning, hanging, cooking, and traveling… all those activities spin up in our lives and can even overwhelm us. I must remind myself to add to that list, resting, remembering, and reflecting. We are surrounded by so much beauty, so many precious connections with people, moments of joy and instants of delight. But they are fleeting. Don’t let them soar away without your attention. Snuggle up next to them. Wrap your mind around their short but precious existence. Soak in every connection, every person you encounter and every moment you live. We won’t walk this way again. Tomorrow is coming and it will be glorious, but don’t miss the chance to savor today. Treasure your loved ones and every moment you have with them now and throughout this holiday season.

May this season bring you love, joy, peace, and of course, a warm holiday hug!

Giving Thanks for Thanksgiving

Last week, as I drove down Wiley Canyon Road on the way to work, I got goosebumps. It was windy and cold but the canopy of trees that bridge the sky had exploded in full fall festival. Warm yellows, firecracker reds, and glowing browns adorned the heavenly landscape. Wisps of wind carried retired leaves across my windshield, delicately drifting down to speckle the gray pavement and manicured lawns with their delightful autumn palette. Fall is here, embracing us with its colors and cool kiss. 

Thursday is Thanksgiving! That means the seasonal activities are in full swing. My son is flying home this year, and we are hosting my nephew and his family for the Thanksgiving feast. We are busy buying the fixings, cleaning up the house and stocking the woodpile for the cold winter nights. I’m ready for the family time, puzzle time, play time and even quiet time. They are on their way. But around all of that, is a time for giving thanks. It’s a great time to pause and reflect on all that the year has given us and be grateful. It is a time to appreciate and recognize our blessings, the help from others, the joy, the experiences, the accomplishments, and the learnings.

Look, I get it. It’s hard to be thankful when things don’t seem to be going right, when we are disappointed, when we get bad news, face hard issues, difficult relationships or dreams unfulfilled. I know that. I understand the challenges and losses that often come our way. But can I encourage you today to swim around in the soup of all the things this past year handed you and look for that golden nugget of goodness that came your way? Maybe it was a kind word, an enjoyable discovery, or a peaceful moment. I suspect you have something. Hold it in your hand right now. That is yours, my friend. It belongs to you. Don’t let it escape! Hold it tight and let its joy wash over you. Appreciate it. Savor it. Smile! Let your gratitude build and radiate from you, changing your world and the world around you, one happy thankful thought at a time. 

I want you to know, I appreciate you! Thank you for reading this today. Thank you for making this year so enjoyable and interesting. We dreamed, we planned, we accomplished much… and there’s much more to come!

Happy Thanksgiving!

2024 Vote and Seasonal Decor

I don’t know about you, but I’m looking forward to getting past US election Tuesday so my cell phone can get a break from the texts and telemarketers, and we can get on to being flooded with black Friday ads instead. It does occur to me that it would be extremely hard to forget to vote with all the media channels blasting reminders to us. I love how the State of California even sends out notices, including step by step “your ballot is in the mail to you” to “your ballot has been counted” updates. They have better ballot tracking updates than Amazon does for shipping! Anyway, if you are a US citizen, do democracy a solid and make sure you vote.

Trees and lights! Our annual ritual begins. I know, you probably complain about the way-too-soon sprouting of Christmas trees next to the Halloween decor at your local retailer. I laugh about it myself, but I confess, we religiously open the attic on November 1st to unleash the holiday cheer for our home too. Down come the tree boxes. Plastic bins packed with lights, ornaments, greenery, golden treasures, and silver bells all parade down the steps to the main hall. Popping sounds are heard across the house as the boxes unleash their seasonal joy. Bing Crosby, Nat King Cole, and Michael Bublé paint the air with their familiar festive vocals. The Keurig sets aside the coffee pods and begins churning out hot chocolate, heavy on the chocolate. Slowly the scent of jubilation can be felt everywhere!

Pass me the ibuprofen! This weekend I strung the lights on the house, across the back woods and onto the fence. It takes a solid day and about a week to recover. I’m pretty much a wimp. Lifting, wrapping, hanging, draping, and zip-tying all that spectacular magic in place pushed me past my regular “I can push the J key” hard work. But oh my goodness, as night fell last night, our yard erupted once again with the multi-color sparkling madness that is the seasonal decor. That’s right! It’s beginning to look a lot like Christmas! I’m sure our neighbors love us.

The holiday season can be stressful. We have so much we want to do. There are things to buy, people to see, tasks to get done. But don’t lose sight of the joy! If you feel overwhelmed or anxious, pause and reflect. The miracle of merriment presents itself when we get a chance to enjoy our labors, savor the beauty around us and spend time with the ones we love. So, yes, if you are crazy like us, pull down those boxes. Hang up the delightful decor. Cuddle up next to some cozy memories. Reminisce and appreciate the season. Share time with your loved ones… oh yes, and vote.

Be a Danny

Last Wednesday was my brother’s birthday. Well, technically he was really my stepbrother-in-law. While Danny was 15 years older than me and didn’t enter my life until I was almost a teen, I was proud to call him my older brother. He was a technical wizard. He introduced me to electronics, taught me how to solder, program and troubleshoot. I spent a few summers with him installing large HVAC systems in aerospace manufacturing plants. We built computers, framed houses, repaired cars, ran network cables, and even built an automated furnace control system using a PC and a game controller for the local glass plant (no joke!). Danny was more than a brother. He was a mentor. He took on projects and took off work, just so he could spend time with me and teach me.

Eight years ago, we took Danny to Disneyland along with my sister, niece, and her family. We had a great time, but Danny grew tired quickly. We initially got him a scooter to ride, but he was too proud to use it at first. Eventually, the exhaustion won, he gave me his signature eye roll grimace and rested himself on the seat. The recently diagnosed cancer was wearing on him. He looked good and told me that he even felt good too, but the fatigue was overwhelming. My big brother was always active, always a helper. It was hard to see him succumb to the illness that was invading his body. We made wonderful memories that October, exploring the parks, laughing, reminiscing, and spending time together. Little did I know that it would be our last time. The cancer would soon take over. He would no longer be able to travel and all too soon, his body would give out.

I miss Danny, but I’m grateful for all the happy memories, the fun times and even the work times. He blessed me with his time, his talents, his wisdom, and his love. Our friends and family that surround us, shape us. They propel us, lift us up when we are down, and challenge us when we are behind. A light nudge. A lesson given. A sympathetic hug. They show us new things and remind us to cherish the old. Those small investments become the brick and mortar of our lives. We thrive because they cared. Danny was that for me. I’m forever grateful for his life, his impact, and the time we shared.

Who is your Danny? I suspect there is someone who has had a big impact on your life as well. If they are still with us, thank them. If like Danny, they have graduated from life, remember them. You don’t need to wait for Día de Muertos, you can start today. Pay respect, cherish, and celebrate their life and the blessing they were to you. And most of all, look for the opportunity for you to be a Danny to someone else. Pay it forward. Pour your time into others around you. We need you! Friends and family need you. You can make a difference in someone else’s life. The time we have is short and precious. Don’t wait! Invest and create those memories today.

1984 – This was shortly after we met for the first time. I was 13 and eager to learn from my older brother.
Danny and Jason, 2004
Jason and Danny enjoying Disneyland on Oct. 19, 2016. This would be the last time we would spend together before his passing on December 3rd.

Prepare for Turbulence

Turbulence. Frequent flyers can tell you tales about sudden and unpredictable changes in air pressure and airspeed that caused the aircraft to shake, wobble, or drop unexpectedly. I’ve been on many flights like that. When the turbulence hits, passengers will gasp, yelp, or add other colorful commentary to the situation. I just laugh or cry uncontrollably like I’m on the Indiana Jones ride at Disneyland. 

In December 2022, Southwest Airlines hit serious turbulence. But this time, it wasn’t just in the air. The crisis was at the peak of the holiday travel season and is referred to in the news media as the Southwest Airlines holiday travel meltdown. What had gone wrong? Severe weather had resulted in some of the first flight cancellations. That meant planes, pilots and crews were not where they needed to be. The software systems Southwest used to track all of that were woefully outdated and unable to respond to the weather disruptions and massive holiday travel load. Flights were getting delayed or canceled due to business process problems, missing aircraft, or missing crew members. It continued to spiral down. Their technology couldn’t handle the fluid turbulence of rapidly deteriorating conditions. Eventually, the carrier was forced to cancel more than 15,000 flights. Passengers and crew members alike were stranded, frustrated and furious.

Turbulence leads to learning. Last week in Las Vegas, Lauren Woods, CIO of Southwest Airlines, took the stage in front of a crowd of technology leaders at the Enterprise Technology Leadership Summit. She explained how the meltdown was the result of antiquated systems and processes. They were too slow and never designed to handle this level of change. But navigating turbulence forces you to learn and grow. They streamlined their business processes, insourced their IT and migrated their systems to the cloud, leveraging a serverless multi-regional highly resilient approach to build their new fare search, airline, and crew scheduling systems. They saw a 400% speed improvement over their previous solution. The crew scheduling system was replaced with a new tool with advanced algorithms and specific capabilities to manage disaster scenarios and quickly adapt to scheduling turbulence. It could quickly track and optimize flights, planes, and crews. They called this new tool, Crew and Aircraft Integrated Recovery and Optimizer (CAIRO). The result? When recent turbulent moments hit, their system was able to respond quickly, adjust to unexpected conditions and ultimately deliver their passengers and crews to their rightful destinations. Southwest now has the lowest cancellation rate of any airline, thanks to this investment.

Turbulence happens. Are we ready for it? What is going to shake up our cabin and disrupt our businesses? Whatever it is, we need to prepare for it. That means investing time and resources into making our processes and systems more reliable, resilient, and ready. Are we ready? Where do we have opportunities for improvement? Let’s talk… before the turbulence hits. It’s time to fasten our seatbelts and prepare for takeoff.

Have a safe flight!


A Blueprint of Encouragement

The floor creaked when I walked into the foyer. Above me hung an old gas lamp that had been converted to electricity. The spirit of the flames cast warm pools of light on the old wooden floor. I glanced to my right and saw a row of desks and an executive office suite tucked away in the corner. A familiar laugh burst through the door, along with my dad. “Jay-boy!” he exclaimed. Endearing, but slightly embarrassing, especially for an eleven-year-old. His long arms wrapped me in a bear hug. “Let’s get you set up!” He led me around the staircase, past the kitchen area, to a door that plunged down into the basement. It was lit with overhead office fluorescent fixtures, but the dark walls seemed to absorb all the light like a cave. We walked around the corner to a small room with a huge machine that took up the width of the room.

My dad flipped an electrical switch, and the beast came to life with an intimidating hum. Big hidden fans started quietly whistling and winding up like a jet engine. My dad got busy twisting knobs, adjusting metal shields, and moving across the metal monster like he was playing an instrument. “Here we go!” He twisted a metal valve. What was that? My wondering didn’t last long; the answer came to me like a punch in the face. Ammonia! My eyes started watering immediately and I coughed. My dad burst out laughing. “You’ll get used to it! Help me with this.” He directed me to the metal drawer and pulled open a black plastic package. He pulled out a large sheet of paper that had a faint yellow tint all over it.

“Here, align this drawing on top of the yellow paper,” he instructed as he put a translucent engineering mylar drawing over the top of the paper. “Make sure it is perfectly aligned and then feed it into the light roller here.” I noticed that the machine was starting to glow. It seemed to have a blue tint. I could feel the heat radiating from the clear glass cylinder. I followed my dad’s direction, aligned the sheets, and fed them into the machine. The paper rolled up and over and appeared in the tray just above the light.

“Notice how the yellow is gone where there was no line work.” It was true! The light had burned off the yellow. He continued, “Feed the yellow paper into the developer. Keep it tight against the conveyor belt.” We curled the once-yellow paper into the top part of the monster. It took its time but finally started feeding out into the top tray. Round two of the ammonia hit me. I’m pretty sure I had tears coming down my face by now, but I couldn’t stop looking! The paper was full of blue lines. “See, that’s the blueprint. That wasn’t too bad, was it?” my dad asked, ignoring my gasping and wincing from the smell. “I need to go back upstairs to make some calls. Run the rest of the prints for me.” He started walking out of the room and glanced back at me. I must have had my mouth open because he burst into his signature laugh. “You can do it! I know you can. I’ll be upstairs if you need help.”

Now it was just me and the ammonia dragon. I shook my head and half-heartedly encouraged myself, “That’s right, you can do it, Jay-boy.” Well, it turned out not to be that difficult. I managed to get all the prints he needed, and I would go on to run hundreds, no, thousands of prints on that machine over the years. It was magical every time. Or maybe it was just the ammonia.

That metal dragon retired many years ago. That old office house is gone, demolished to make room for a new highway. My dad is gone too. I miss him but will forever remember his trust in me. “You can do it, Jay-boy.” It was fuel to face the ammonia behemoth at the time, but more importantly, it taught me the power of encouragement.

Have you encouraged someone else recently? I need to do more of that. We can all use some encouragement and so can those around us. Take a moment today and think about someone you know, a team member, a loved one, or a friend. Encourage them. I know you can do it, and hopefully you can do it without ammonia. 

Have a great week!