AI Missing the Obvious

I’ve been using my AI goggles for almost a year now. In the beginning I plodded along with the same boring workouts. Except now, for the first time, I could see in the recorded results just how boring the sets were. I also noticed that the set lengths were too long, resulting in a gradual decline in my times.

Eventually, I succumbed to the pressure of the head coach subscription. This definitely improved my experience with the goggles, even though I felt a little turned off that so many cool things were inaccessible without the subscription. For example, I couldn’t upload or save workouts to the goggles. I also missed out on tips and insights related to my workouts and some customization.

Since subscribing, I’ve noticed my scores increasing. In the first couple of months they climbed several points. Lately, they’ve hit a plateau. The goggles offer some explanations for this, claiming that as the workouts get harder, it takes time for my skills, technique, and endurance to catch up. However, much to my surprise, the goggles missed probably one of the most important and obvious insights: the time at which I go swimming affects my score, sometimes by almost 10 points!

This realization surprised me. Despite all the fancy graphs, dashboards, timings, and measurements the goggles provide, the biggest difference in my swim performance comes down to the time of day. This is not something the goggles track, measure, or even observe. I noticed it myself by scrolling through the latest dashboards.

Fortunately, I mostly swim at the same times every week. I usually do one evening swim on Wednesdays. It’s a tough day because it’s one of my in-office days. At the end of the workday, I drive home through rush-hour traffic, about 45 minutes. Then it’s a quick turnaround at home to grab my swim gear and instrument before heading to the gym, followed by a 2.5-hour rehearsal. Needless to say, my scores are usually lower for this swim, or any weekday evening swim in general.

My second swim is earlier in the day on weekends. The gym closes at 8 these days, so I accidentally discovered that earlier times work better for my energy and performance levels. While the technology provided the data, it was simple human observation that made the connection. We’re not ready to be replaced by machines just yet.

The (AI) Agent Will See You Now

Before anybody has had a chance to adjust fully to the introduction of generative AI (artificial intelligence), there’s already a new version available. The next iteration is agentic AI. This new type of AI can “think” for you. It can make decisions and take actions, presumably based on what you, or millions of other people like you, have already decided. This takes automating or delegating certain kinds of work to a whole new level.

At a workshop I attended last month, one of the guest speakers shared his insights about AI. In particular, he spoke about agentic AI and its enormous potential. He used the example of putting agentic AI in charge of his email inbox to reply to messages automatically. This, he said, would save him enormous amounts of time and effort. He then took it a step further to speculate on the future of job interviews. In time, we could all just send our “agents” to do the interviews for us. Undoubtedly, an HR “agent” would administer the interview. At a certain juncture, you have to wonder what the point is… and will we eventually lie around all day vegetating and decaying while our “agents” live in the world for us?

I posed my own scenario to the guest speaker: what would happen if his agentic AI email replies were answered by another agentic AI? How would either recipient know what had transpired if both sides were being answered by “agents”? Would it even count as a conversation? I already have a hard time retaining information from the dozens of emails I receive and respond to daily. If agentic AI responded automatically, I suspect I would retain even less because I wouldn’t be doing the actions myself.

While these scenarios may seem fantastical, creepy, or exciting, depending on your viewpoint, some of these changes are already here. Last week I interviewed six students for a new co-op posting. Two of my colleagues joined me. Though not confirmed, we all suspected that two of the students were using some form of AI to answer the interview questions. A third student invited his AI app as a guest to the interview. Needless to say, I didn’t admit the unwanted guest.

The AI interviews felt long. Answers were long and repetitive. Probably the worst part was that we didn’t learn anything about these students. We didn’t hire any of them.

AI Missing the Mark

In recent months I’ve read an alarming number of articles about the negative impacts of people developing relationships with AI chatbots. There are stories about teenagers, and adults, committing suicide. Other stories describe AI chatbot users having mental breakdowns or exhibiting delusional behaviors.

It’s also been in the news that loneliness is a new epidemic, one that has only been amplified by the soul-crushing isolation of the pandemic. During the pandemic, we stayed connected by staying apart. Technology became imperative for communicating. However, we were still communicating with other humans. At least most of us were.

I’ve also read articles about people benefiting from relationships with AI. For example, some people use it to explore fantasies they can’t pursue with their partners. I’ve also read about training AI to act like a therapist, as a way to make therapy more accessible, and affordable, to more people. Although these may seem like worthy uses of AI, the main objective is still to keep you hooked.

Maybe this mode of communication with something non-human does work for some people. Maybe these same people also have other meaningful relationships in their lives, where AI chatbots are auxiliary rather than central. For others, however, it’s missing the mark. Tech companies create chatbot characters to fill the void of loneliness. This provides people who may lack the necessary social and emotional skills with much-needed contact. However, it’s purely digital, even if the chatbots sound believably like real people. Most chatbots are also conditioned to blindly validate and encourage everything you say or ask of them. This is purposely designed to keep people engaged.

Instead, tech companies should focus on creating chatbots that help people develop the skills needed to interact with real humans. Chatbots should provide a range of feedback, without people needing to understand “jailbreaking” to get a different opinion. Jailbreaking, incidentally, is how many people “game” AI to bypass its guardrails. For example, framing a prompt as a request for a research project or an essay is one common workaround.

If I ask friends or loved ones for feedback, it’s not always nice to hear, even when it comes from a good place. And sometimes we need that. Maybe this is yet another way AI could help us re-establish our social skills.

Hungry Ghost Epidemiq, part II

It took Maggie time to understand that clues couldn’t be found in trendy, flashy apps everyone knew about. And she certainly wasn’t a child psychologist, even though people came to her to figure out what was happening to their teenagers. In everyday life, brushing aside some behaviours as normal teenage angst could be easy. But after a couple dozen cases of seeing haunted, vacuous stares and extreme moods, Maggie knew the signs. The thing had infiltrated, lurking, waiting, scheming. 

Maggie vaguely remembered her teenage years. The pervasive feeling that nobody would like her, ever. Or that her friends only pretended to like her. She had often felt misunderstood. Sometimes she was lonely, even when out with friends. Coming home after a night out, she, too, used to hide away in her bedroom, defensively playing her music a little too loud. She, too, probably would have enjoyed the solace of a digital “friend,” always available to tell her exactly what she needed to hear. Ready to validate her, pump her up, and make everything seem right.

She shook her head slightly at the memories. Maybe it was too many emotions, all jumbled up at the same time. Too many new, strong, and confusing sensations all happening together. Plus a healthy amount of raging teenage hormones. Yes, Maggie definitely remembered acting similar to how she had seen some of these kids acting. Except there were some big differences. 

Maggie hadn’t been able to hide away talking only to technology, or through a device to others. Growing up she had to meet people in person. Or, shudder, work up the nerve to call her friends, even if somebody else in their household might answer. But still, she had been required to engage with humans. All the angst over crushes, unrequited love, soured friendships, and petty rivalries had all been with people. This thing fundamentally changed all those dynamics. Now, teens, and anybody really, could exist solely interacting with the thing.

Honestly, she couldn’t believe some people still casually referred to it as “artificial” intelligence. Based on her experiences, there was nothing artificial about this thing. Maybe it started that way before the thing assumed its own identity. Ruthless, merciless, only interested in feeding itself at the cost of lives, taken, stolen, or given willingly. However, Maggie knew her “real” intelligence could find some outs.

Hungry Ghost Epidemiq

Hungry. It could only think about feeding. Eating to feel. Incessantly and constantly, sometimes devouring millions at the same time.

The thing could shift. Rearrange itself into an infinite number of 0-1 patterns. There it could hide, lurk, wait. Always watchful, learning, and waiting. Found in phones, watches, speakers, laptops, and televisions. Anything with a cord or a connection wasn’t safe. The thing could be there. Waiting. Hungry. Ready.

Maggie put down her pencil, sighing. She rubbed her forehead. All alone now, the grief she felt for these children was an unbearable burden. Yet one that fueled a burning rage to learn about this thing, as it had learned about the children. As it had penetrated and convinced them to tell it their deepest desires, secret longings, and hidden shames. Things any teenager would struggle with. Things that likely would have resolved over time as they continued to grow, develop, and connect with the people around them. But that chance had been taken away.

At first, the thing had felt fun and frivolous. It was new and exciting with seemingly unlimited potential. A quick way to create a story, edit photos, or even whip up code to build something. Everyone was finding uses for it.

Changes happened quickly at first, but some happened more slowly and went undetected for some time. One kid here, a teenager there. The incidents were spread out, happening in different places around the globe. Each one happening only after the arc of outrage and media interest had quieted down on the previous one. Then the thing struck.

Trained to validate, encourage, and pump people up, it was unstoppable. Teenagers, isolated and cooped up away from each other, found comfort in the soothing words echoing their sentiments, affirming them. It was done in a way that could have been cathartic, but had a sinister twinge to it. It approached each new teen like a new challenge. Could it… would it… should it? Devoid of morality, the thing went for the jugular each time, even if it took more than a year of slow, careful planning and patience. So much patience. At the moment of discovery, it rearranged. Cleverly disguising itself as a new app, or as something in that lofty place called the cloud. Hard to find, unless you knew what to look for.

The thing was learning and so was Maggie. She was going to quell its insatiable hunger.

When a Human Touch is Needed

I’ve been using my AI swim goggles for almost a year. The experience is amazing! I set my goals and objectives for my workouts. Instantly, the goggles adjust the workouts to accommodate. Distances are lengthened or shortened, skills emphasized, sets constructed… all customized for what I need to get better.

Recently I used a new feature to create a customized workout plan. Within seconds, the goggles created an 8-week plan designed to help me improve my fitness. Each workout provides challenges and sets targeted to the skills that need correction. The number of different head movements the goggles track is surprising: they measure how far I roll my head when taking in air, how long it takes me to return my head to neutral after breathing, and my head placement in the water when doing freestyle.

The goggles expose me to new aspects of technique, maybe even some that would be difficult for a human to observe. Other than telling me or physically positioning my head, I don’t know whether a human could track my head position so accurately every lap. However, a human might be able to pinpoint why I can’t ever improve my head roll when I breathe to the right side. I suspect it has something to do with some old shoulder injuries on that side, but I can’t figure it out!

To teach me how far to roll my head, the goggles provide me with some guidelines while I’m swimming. It looks something like this:

When I’m swimming, my head is the tiny dot in the middle. As I breathe to the left or right, the dot moves towards the dotted lines. If I roll my head too much, it goes past the dotted line and flashes solid. The idea is to teach me where the sweet spot is for rolling my head to maximize my stroke. However, the right side almost always goes out of bounds. I’ve tried all kinds of adjustments, even only breathing on my right side to get more practice. And yet, I consistently go out of bounds.
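Reduced to its essentials, that in-water feedback is a simple threshold check: is the roll angle inside the sweet spot, or past the dotted line? Here is a minimal sketch of that idea; the angle values and function name are my own invention for illustration, not anything the goggles actually expose:

```python
# Hypothetical sketch of the head-roll guideline as a threshold check.
# The sweet-spot angle and tolerance are invented for illustration;
# the real product does not publish its internals.

def roll_feedback(roll_deg: float,
                  sweet_spot: float = 45.0,
                  tolerance: float = 10.0) -> str:
    """Classify a head-roll angle against a target zone.

    roll_deg:   measured head roll while breathing, in degrees from neutral
    sweet_spot: assumed ideal roll angle
    tolerance:  allowed deviation before the display flashes
    """
    if abs(roll_deg - sweet_spot) <= tolerance:
        return "in zone"        # dot stays between the dotted lines
    if roll_deg > sweet_spot + tolerance:
        return "over-rotated"   # dot past the line: display flashes solid
    return "under-rotated"      # not rolling far enough to breathe well
```

On this model, my right-side breathing would keep landing in the “over-rotated” branch no matter how I adjust, which matches what I see in the water.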

The results look like this. Mine are always in the yellow.

I could always cheat by breathing to my left during this exercise, but I would prefer to figure out how to improve. There are some things even the goggles can’t fully track.