It fumbled my Pacific Northwest trip

🕒︎ 2025-10-30

Copyright The Boston Globe

That said, I am not so sure about how much it can act as an adviser. I say this as an advice columnist who’d like to keep her job. I also say it after watching my sister use it for medical advice. Sometimes her ChatGPT “assistant” is correct — and helpful. Until it isn’t, and it’s just mirroring whatever she tells it to think.

This is what happened when I tried to use ChatGPT to design my vacation to the Pacific Northwest, a region of the country I’d never seen. Googling a bunch of tourist sites was proving to be confusing. I didn’t want to use influencers to help — because seeing too many Instagram reels about a spot can lessen the effects of experiencing those landmarks in the moment. I wanted to go to these places unspoiled.

Many months ago, I asked ChatGPT — as an anonymous user without a profile — to plan a trip from Portland, Ore., to Seattle, stopping at Olympic National Park. Then I admitted my main goal for the whole trip: I wanted the vacation to be built around sites from the 1985 film “The Goonies.” Basically, as a Gen X, pop-culture-loving tourist, I wanted a “Goonies”-forward vacation, but one that also gave me a sample of the natural wonders of Oregon and Washington.

What I got back from ChatGPT, at first, was excellent. Seemingly. “Here’s a fun and scenic 7-day vacation itinerary from Oregon to Seattle, stopping at Cannon Beach, Astoria for Goonies filming sites, and other great coastal and city spots. It mixes nature, nostalgia, food, and urban fun.” (ChatGPT loves to bold things.)

Great. But then I noticed: Day 1 didn’t leave much time for picking up a rental car. Also, it had us stopping by multiple waterfalls. Maybe because I mentioned waterfalls. “Aren’t a lot of waterfalls, like, the same?” I asked it. ChatGPT answered by saying that yes, I was right. Of course I was right! Did I want a new itinerary with only one waterfall? “Yes,” I said. It gave me that. Perfect.
But then I remembered that my friend Jenn told me there’s great wine in the Willamette Valley. Isn’t that close to where we’d be? “Can we add in a fantastic winery in the Willamette Valley?” I asked ChatGPT. Of course. What a good idea, it told me. It added a winery, scheduling the stop between Astoria and our drive to Washington. The winery visit would add a few hours, of course, but ChatGPT seemed to think I’d made a great decision to put this in my journey.

Then I made the mistake of asking, “Is there anything gorgeous I’ve missed? Any site I’ll regret not seeing?” It gave me a list of possible extras, including Thor’s Well, also known as the “Drainpipe of the Pacific.” It’s basically a big hole at the edge of Cape Perpetua that looks astounding in pictures. I began mistakenly calling this natural wonder “Thor’s hole,” and told my partner I was desperate to see it. “We are going to see Thor’s hole!” ChatGPT had gotten me so excited.

Then I looked at the driving time. “Wait,” I asked ChatGPT, “Is Thor’s Well totally inconvenient based on the fact that I want this vacation to give me ample time to see the ‘Goonies’ stuff — which was the whole point in the first place?” It agreed with me. I’d asked it for too much.

That’s when I realized my wishes had spun out of control, mainly because ChatGPT is trained to validate me. For every idea I had, it basically said, “What a fantastic addition to the mix!” If I told it I worried I had packed too much into the trip, it agreed with that, too, telling me I was probably right. Over a few weeks, I made about 14 itineraries.

It was around this time, by the way, that I also learned the environmental impact of all of this. Oh no. When I searched the effects of using ChatGPT on the environment, the AI-written summary told me a ChatGPT search session uses five times the energy of a regular web search. I traced that to a story about this from MIT.
Reading the numbers, I felt guilty and stopped, calling my last itinerary the final one, even if it was a mess. I went to Reddit for tiny questions.

A few weeks before I left, I saw a work friend, Malcolm. I told him about my upcoming trip, and he told me something I didn’t know about him — that he visits the Pacific Northwest with extended family. Very quickly he told me two things that ChatGPT did not: First, Sequim, Wash., where I had booked an Airbnb for the Olympic National Park part of the vacation, is pronounced Squim. (That saved me a lot of embarrassment with locals on the trip.) Second, he told me that Sequim is the lavender capital of the country. ChatGPT did not tell me this. Because I didn’t ask.

Malcolm knows that I’m not an outdoorsy person, and that I have bad allergies (I am constantly blowing my nose next to his desk), but there are acres of lavender everywhere in Sequim, he said. It’s worth viewing the beauty and spectacle. Malcolm told me about places that sell lavender by the bunch and serve lavender ice cream. He said that it isn’t very good and kind of tastes like soap, but that I should try it.

I assume Malcolm also knows I’m kind of lazy, so he helped me prioritize a few stops on my itinerary, based on what he knows about my tastes. Yes, the Hoh Rain Forest is worth it, he said, and he told me to block out a whole day for it. (ChatGPT had told me to do five other things on my Hoh day. That would not have been possible.) It was Reddit that confirmed that piece of advice — real people explaining how they went on the Hoh journey, advising me to wake up at the crack of dawn to beat the long lines to get in. (For the record, I recommend arriving at the Hoh by 7 a.m. It was peaceful at that hour, and I didn’t have to waste a single minute waiting in line.)

I’ll admit that ChatGPT got me the basics. It helped make a list of “Goonies” sites, although I’m sure a basic Google search would have worked, too.
The more important part of this learning experience was that when I returned from my Pacific Northwest journey, I was able to tell Malcolm that I loved the lavender farms, and that I did eat the ice cream, and that while it was soap-like, it was still a nice snack. I hadn’t bonded with Malcolm in a while, actually, because we’re busy and out of the office a lot. Not to be cheesy, but that’s part of it too, right? ChatGPT was pretending to be my friend. But Malcolm actually is. One reason we travel is to connect with others when we get home. I was happy to do that.