Job Hunting in the AI Era

Speaking with a co-op student last week, I was surprised to learn that she had applied to 250 positions! Having always been a bit specialized in my profession, I found it hard to imagine so many relevant opportunities. Even in my more aggressive days of job hunting, I could never find that many available postings.

I remarked that the use of gen-AI (generative artificial intelligence) likely made so many applications possible. Putting together a cover letter and adjusting a resume take time. Even with my timesaving resume formatting trick, modifications still require effort. Researching the company adds more time to the process, whether that means reviewing the corporate website or doing a few quick Google searches.

However, with gen-AI many of these tasks can be simplified and expedited. AI can easily summarize the main points about a company, or pull out the key points of a job posting and match them with resume highlights. Gen-AI can even write cover letters tailored to the specifics of the job posting while pulling in relevant parts of a resume.

Although this sounds tempting, I’ve become adept at picking out AI-generated application content when hiring. Often when I review co-op student applications, I get a whole pile of cover letters that sound eerily similar. I’m guessing it’s because the students all used similar prompts and fed the same job posting to the gen-AI. In a lot of cases, key connections are missing with this method. Some students rely too heavily on AI, without enough proofreading or the carefully guided prompts that would show a human driving the process.

On the flip side, it’s likely that more companies are also using AI on the posting and hiring side. Companies probably use AI to generate job postings. It can write summaries, job qualifications, and key responsibility points, and adjust the tone and style of the language. This takes mere minutes.

I’m sure many HR departments are now also flooded with AI-generated applications. It would only seem natural for them to start using AI to filter the applications. It’s as though AI agents are doing all the applying and hiring on both sides. If it isn’t at this point already, I’m sure it will be soon enough.

All this to say, it’s making the job market a very different landscape. One that is evolving rapidly on all sides.

Transcendence

Every time I play Beethoven, I feel transported. For my orchestra’s opening concert of the season, I had the good fortune to play Beethoven’s 6th symphony, the “Pastoral.” Few pieces, or composers, have the ability to put me right in the action with the same intensity and immediacy as Beethoven. From the opening phrase in the violin section, I was in the moment. For 40ish blissful minutes, nothing else existed except the music.

Playing the first movement, I imagined myself traipsing through a forest on a warmish spring day. Throughout the five movements of the symphony, I could feel my part syncing with everyone else’s. Beethoven can be tricky because we don’t all enter at the same time, but if you stick to your part, it sounds right in the end. Somehow everything comes together. I recall a feeling of unity in the second movement, when the clarinet and I matched our parts to sound like one voice lifting with the melody.

But maybe my favorite movement of the whole piece is the fourth, the storm. It feels urgent and intense. When I play it, I have the sensation that I’m going to bounce out of my seat from the energy. Finally, Beethoven brings it all together in the fifth movement, the calm after the storm. The clarinet opens, followed by the French horn. The mellow hum of the horn feels like a ray of sunshine peeping through the clouds, warming my back. This might also be because the French horns sit behind me.

At the end of the horn’s opening phrase, the orchestra comes in with the main theme. I especially like this part because each time the main theme repeats throughout the movement, I play an “A”. It’s not a particularly special note on the bassoon, but it feels gratifying to hold it at this particular moment in the piece.

I find it extraordinary to feel fully immersed in a non-digital experience these days. With so many unnatural beeps, chirps, squeaks, rings, and other notifications, it’s rare to enjoy something fully powered by humans. And it feels even better when you’re right in the center of it, sharing it with others.

Updating with Updates

A few weeks ago I got an update on my work computer: a new laptop with a new operating system. The mouse on the new laptop is extremely sensitive. For the first few days, I found the slightest touch was enough to send the cursor flying, or to open random things while minimizing others. Even hovering my finger over the touchpad produced enough vibration to have an impact. This was the first big change requiring adjustment.

The other big change is the operating system. All of a sudden, everything looks different and is too “smart” for its own good. Functions I relied on multiple times a day are in different places, or difficult to find. For example, I often use the “delay delivery” option in email to schedule messages to be sent at a future time. This is useful when somebody is on vacation: the email can be scheduled to arrive when they’ve returned. Now, however, the feature is in a different place with a new name.

“Delay delivery” was the name I knew it by, and accessing the feature required me to pop out the email; otherwise it wasn’t available. The new name is “schedule send.” The option now sits in a small dropdown menu directly next to the “send” button. It’s easy to miss, but also easier to use. The option feels fresh, modern, and straightforward, a definite improvement.

While I’m sure I’ll end up liking the new changes, provided I can figure out where everything is and how to use it, the adjustment is taking time. Assuming that end users, like me, will transition seamlessly to a new laptop that operates differently from the previous one creates challenges. Some basic things are transferable, but it’s frustrating to have to relocate all the things I loved best, or to spend energy doing Google searches and reaching out to people to find them. Some things didn’t transfer properly, and figuring out the fixes is also time consuming.

I’m sure the new changes will ultimately lead to a better working experience. But right now, I’m feeling slightly disappointed that all this new, fancy technology can’t help guide me better in the early days. Perhaps my expectations for how and where technology will help changed with the introduction of AI. I’m still waiting for an update on the updates.

Customization Expectations

As part of the renovations we needed new cupboards in the pantry area. The previous shelving had been hastily built against an unfinished wall complete with nails jutting out in awkward places. Consequently, anything pushed too far back on the skinny shelves plummeted to the floor. The nails added a touch of hazard and excitement to the otherwise boring task of retrieving something from the basement.

We wanted something adjustable and decided on IKEA. We felt this would offer us options without being too expensive. I hadn’t purchased IKEA furniture in years. There have been many changes since my last visit. Though IKEA has always been a leader in offering customizable options, it’s at a whole new level now.

Browsing on the IKEA website, I was surprised to see options for designing your own storage units. Options for designing a bathroom, kitchen, or other living spaces were also handy. When I purchased PAX units to make built-in closets about 7 years ago, I probably went to the store to see floor models. Now, you can design everything from home without making a trip, or 20, to IKEA.

Customizing units is easy in the design feature. Adding room measurements is also quick, and swapping out colors and interiors takes only a few minutes. Rotating or moving units to see different placements takes a few mouse clicks. There are even built-in design rules that flag flaws. For example, in one design, there wasn’t enough space between the cabinet doors and the walls. On paper, everything seemed to fit based on the measurements. But likely, had I built it that way, the doors wouldn’t have opened properly.
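The door-clearance rule can be pictured with a tiny sketch. This is entirely hypothetical: the function name, measurements, and logic are my own illustration of the idea, not IKEA’s actual planner.

```python
# Hypothetical sketch of a design-rule check like the planner's:
# a hinged door needs roughly its own width of free space between
# the cabinet edge and the nearest wall to swing open fully.

def door_clearance_ok(cabinet_edge_mm: int, wall_mm: int, door_width_mm: int) -> bool:
    """Return True if the door has room to open 90 degrees."""
    return (wall_mm - cabinet_edge_mm) >= door_width_mm

# On paper the cabinet "fits" (it ends 200 mm before the wall),
# but the rule flags that a 400 mm door has only 200 mm of swing room.
print(door_clearance_ok(cabinet_edge_mm=1800, wall_mm=2000, door_width_mm=400))  # False
```

A real planner presumably applies many such constraints at once (drawer extension, filler panels, plinth heights), but each one reduces to a simple geometric check like this.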

Picking up the order at IKEA, I had another pleasant surprise. Near the warehouse area, I used one of the available computers to enter a code for my design. Instantly, a checklist appeared on my phone with locations and quantities for the items. However, that’s as far as the customization went.

Arriving home, I felt slightly disappointed that IKEA didn’t create a customized manual for building my design. Instead, I fumbled through the printed manuals, figuring out how to build a multi-cabinet unit. I also felt annoyed by some of the waste: many pieces went unused because they weren’t part of the design. The levels of customization we experience daily alter our expectations for everything now.

Externalized Memory

Growing up, I always asked my dad, “How do you know that?”

A lot of the time, he replied that he had been there, had lived through it, or just remembered. But to be honest, it was mostly because my father had an exceptionally good memory. The kind of internal memory that some people rely on, or used to rely on, to get around, do jobs, and tell stories. Now, we instantly externalize our experiences with our devices. This changes how we use our memories. We’re out of practice.

I’ve also noticed my memory declining over time. Though getting older is definitely a factor, it’s also because I don’t need to use it as much anymore. Anything I can’t remember instantly is a prime candidate for googling or digital recall in mere moments. I make a point to remember key phone numbers, but I think most people rely on their phones for this now.

The speed with which search results appear also shapes our expectations for our memories. Although it’s frustrating when I can’t remember something instantly, I challenge myself to stretch my internal memory muscles. I find if I can relax, resist the temptation of a device, and try to follow the chain, usually I’ll remember. Sometimes it takes a while. I think this is partially because I’m always distracted by my phone, trying to do more than one thing at a time. This kind of multitasking is known to impair both memory and the quality of attention.

Another impact on memory is the sheer amount of content we consume. For example, I watched a short mystery series in a week. Though it was only six episodes long, I had trouble recalling when certain key clues were introduced. Since I watched it in a few sittings, there wasn’t time to digest and reflect between episodes. Growing up, we had to wait an entire week for a new episode. Many people also watched the same thing, providing ample opportunities to discuss, chat, and socialize around the watching.

With the advent of AI, we’ve moved from not bothering to memorize to not even learning anymore! When I use AI to help me start a draft of something, it’s because I already know how to do it; I’m just saving time with a shortcut. Today’s students are learning the art of the prompt rather than how to properly structure essays, create outlines, and formulate arguments.

The Skunk and the Samaritan

About a week ago, I scratched my cornea helping to clean my neighbor’s front yard. She’s older, and I noticed some pulled plants drying on the front lawn. After bagging those, I was pulling some dead matter from the yucca. While reaching for that last bit: poke! Luckily, it wasn’t serious.

The following week, when a skunk got trapped in her window well, I had some serious reservations about how far I would go to assist. I felt bad for the animal, but still… skunk spray! After confirming the skunk was, indeed, trapped in the window well, I tried to adjust a basket somebody else had put there, presumably for the skunk to climb out on. However, every time I tried to get close, the skunk pointed her bum at me, aiming to spray.

Then I called the City to report the trapped animal. It was a Saturday, so I wasn’t sure how long it would take for someone to come. In the meantime, I googled what skunks eat so I could give her some food. Learning they eat nuts and fruits, I gently tossed walnuts and a plum into the window well.

The skunk noisily slurped on the plum. She was definitely hungry! While she was distracted, I flipped the basket over.

By this time, someone from the City had arrived. Within minutes the animal was freed! I watched the whole rescue. Maybe the most fascinating part was that the city worker wore only a regular pair of latex gloves. No special clothing, no face shield, nothing protective really. The skunk sprayed a lot during the rescue, filling the air with noxious, pungent fumes. But luckily, I avoided a direct hit.

The City worker scared the skunk away after the daring rescue; she was ready to spray again. Notice the healthy-looking, eye-stabbing yucca in the foreground.

What struck me most about this whole situation was that people knew the skunk was trapped in the window well on Thursday. Somehow, I didn’t get this vital piece of information until Saturday, though I kept wondering why the skunk smell was so strong near the back door for days! Once I found out, I immediately called the City to report the trapped animal.

Even if somebody didn’t know to call the City, which I’m assuming nobody did, isn’t this what the internet is for? To connect people with resources? To answer perplexing questions about an uncommon, surprising situation? And yet, nobody did. Even more puzzling than how the skunk ended up in the window well for over two days was why nobody could connect technology to nature and help this poor animal.