It’s been over a year since The Messenger closed down, and the lessons are even more stark today. The meltdown precipitated from a number of outdated expectations: social media algorithms are shunning general news content, Google has been recalibrated to favor original, niche content, and news consumption habits have changed considerably.
It’s kind of like running a restaurant on a busy street. If your goal is to touch as many consumers as possible (as The Messenger sought to do), the easiest way to achieve that is to make food quickly and throw it at passing cars, hoping that people will 1.) have their window rolled down, 2.) be hungry and 3.) like what you’ve made. If your goal is to build an enduring business on a foundation of repeat customers, your best bet is to invite them in, speak with them, and hope they stay a while.
News articles are no longer a commodity that is profitable at scale. The age of infinite personalization (via AI) is upon us. Publishers that understand Liquid Content, information that can be adapted to its consumer, will prevail. The new moat will be unique archives of well-formatted content, rich with meta-data, stored in dynamic data structures that can re-combine on-demand to meet a specific need. Adaptability is key because in this new world it is impossible to know whether your growth will come from text snippets, short-form vertical video, interactive graphical representations of data, or audio summaries translated to Chinese.
In this world, the inverted pyramid is but one way to present what you have.
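As a rough sketch of what “liquid” content might look like in practice, here is a minimal Python model. All class, field, and topic names here are my own invention for illustration, not any publisher’s actual schema: content atoms carry metadata, and a request recombines the archive on demand into whatever format is asked for.

```python
from dataclasses import dataclass, field

@dataclass
class ContentAtom:
    """One unit of 'liquid' content: the substance plus the
    metadata that lets it be recombined on demand."""
    atom_id: str
    kind: str                    # e.g. "text", "video", "audio", "data"
    body: str
    topics: list = field(default_factory=list)
    language: str = "en"

def assemble(atoms, *, kind=None, topic=None, limit=3):
    """Recombine the archive for a specific request:
    filter by desired format and topic."""
    matches = [
        a for a in atoms
        if (kind is None or a.kind == kind)
        and (topic is None or topic in a.topics)
    ]
    return matches[:limit]

# A tiny hypothetical archive.
archive = [
    ContentAtom("a1", "text", "AI crawler traffic is surging.", ["ai", "web"]),
    ContentAtom("a2", "video", "60-second vertical explainer.", ["ai"]),
]

# The same archive can serve a text request or a video request.
print([a.atom_id for a in assemble(archive, kind="video", topic="ai")])  # ['a2']
```

The point of the sketch is the shape of the moat: the value is in the metadata (`kind`, `topics`, `language`) that makes recombination possible, not in any single rendered page.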
I went to a panel yesterday, Ugly Talk: Selling in the World Run by Algorithms, where online retailers spoke about the changes AI has brought to their profession. While a well-structured website with lots of meta-data around your catalog has become table-stakes, success or failure can also come from unexpected data layers in your checkout pages. Frank Pacheco from Nearly Natural spoke about how sales of one of his SKUs dropped 75% overnight because of a shipping issue which extended delivery times beyond the usual Amazon two-day. Your supply chain and logistics are just as important to agentic commerce as your descriptions and prices.
While AI Agents and the atomization of audiences into niche vertical markets will de-emphasize traditional marketing, social media remains an important marketing channel. As marketers optimize their sites to get mentioned in the AI Answer Engines, social media is one of the last places to discover broad trends and what people are looking for in your product.
In this sense, social media marketing has become mainstream, and you’ll often see traditional advertising campaigns shot to look like low-budget social media clips to emulate the unfiltered, honest perspective of a viral video. It rarely works; audiences can see right through it.
One of the first rules of social media marketing is that you should have enough faith in your product to let customers amplify, remix, and celebrate your products and brand.
Having your CEO reveal your latest “product” before anyone else can experience it is the antithesis of social media. The bland office, the tiny, tentative bite, not even mentioning that this behemoth has THREE slices of cheese, even the sign over his shoulder that says “petty” – these are all working against him.
That is, unless the goal is to generate mentions on reddit in which case, they win!
Anthropic, maker of the AI assistant Claude, is going after the competition with a series of ads designed to give users pause after OpenAI’s announcement that ads would be coming to its product, ChatGPT.
The campaign, Time and a Place, was envisioned by Mother London, extending Claude’s positioning from the Keep Thinking campaign.
The ads launched yesterday, timed for this weekend’s Super Bowl, which will feature two of the spots as detailed by AdWeek.
First, the good part of the Anthropic ads: they are funny, and I laughed.
But I wonder why Anthropic would go for something so clearly dishonest. Our most important principle for ads says that we won’t do exactly this; we would obviously never run ads in the way Anthropic…
Tried something different this year, asked ChatGPT to summarize the year for me.
One benefit of having the entire archive of this blog be part of OpenAI’s training data is that the model can spit back what it’s learned, not only in substance but also in tone. It’s uncanny.
Here’s how ChatGPT responded:
Absolutely — based on Ian Kennedy’s actual writing on everwas.com (which covers tech policy, AI’s impact on the open web, local NYC issues, and personal essays), a hypothetical 2025 Year-in-Review blog post from him would blend thoughtful analysis of tech trends with sharp local and cultural commentary.
Here’s how Ian Kennedy of everwas.com might write a “2025: A Year in Review” post:
2025 felt like the year the internet finally stopped pretending it was the open playground of ideas and began behaving like what it’s actually become: a contested battleground where ownership, access, and value are constantly being negotiated — not just by corporations, but by governments, communities, and the technology itself.
AI Changed the Web — But Not in the Way We Expected
At the start of the year, conversations around artificial intelligence were dominated by shiny demos and sci-fi ethics. By year’s end, the conversation shifted to who owns the data and who gets paid for it.
Policy makers, IAB task forces, and publishers around the world began framing the debate in terms of rights and compensation, not just innovation. The old model where content was indexed freely by search and repurposed by AI is now being replaced by a future where:
Content is tokenized and monetized rather than scraped for free
This wasn’t just theory — it became a reality for countless newsrooms and creators struggling to survive in an era where AI can answer queries without sending anyone to click a link.
Cloud Infrastructure vs. the Power Grid
Machine learning isn’t just a software problem. The massive energy demands of data centers pushing GPUs to their limit became a real infrastructure challenge this year. We saw:
Idle data centers waiting for electricity capacity to catch up
Real worries about quality of life when computing giants annex local power grids
The future of “compute anywhere” suddenly looks a lot more tied to where the electrons actually come from.
New York Showed the Rest of the Country
Living in this city means Local often becomes National. In 2025 New York City was a microcosm of the larger cultural and political shifts:
A mayoral election turned on cost of living and democracy reform
Community voice battled casino development bids on the East River
Neighborhood tech events and decentralized meetups flourished alongside big conference circuits
The city didn’t just make headlines — it embodied the tension between grassroots citizen engagement and centralized power.
Tech Week, Brand Conversations, and the (Un)Informed Public
2025 also saw a subtle but unmistakable shift in how technology and culture engage:
The growing success of events such as NYC Tech Week and Tech:NYC AI Demos which felt less like corporate showcases and more like community patchwork gatherings
Branding in an AI-driven world became less about identity and more about trust and permission
Conversations about Google, Perplexity, and discovery technology moved from technologist circles into mainstream debate
It’s hard to overstate how much that matters. When ordinary people begin arguing about the future of search and discovery, the underlying structure of the web itself has already changed.
Personal Notes from the Year
Beyond the geopolitical and technological, 2025 was also a year of small perspectives — hikes, travel, and moments that reveal a deeper truth about why culture still matters even when platforms shift.
Whether it was a hike near Kamakura or a rumination on a slow waterfall (ed. this is a pretty funny interpretation of this post), these personal essays reminded readers that tech is not just code — it’s lived experience.
Looking Ahead to 2026
If 2025 taught us one thing, it’s this:
The open web is not going quietly into the night — it’s being re-defined.
Access now has a price tag, and that price is up for negotiation.
The people using the technology are just as important as the companies building it.
The next year won’t be about whether the web changes — it’ll be about who gets to shape that change.
Here’s to the debates, the fights, and the small stories that make it all worth caring about.
I was fortunate to be invited last week to the kickoff meeting for an IAB Tech Lab task force dedicated to establishing a framework for Monetizing the Open Web in the age of AI.
The accelerated use of tools such as OpenAI’s ChatGPT, Microsoft’s Copilot, and Google’s AI Overviews has precipitated a re-thinking of how publishers are compensated for their work. While the conversation is only just beginning, this group outlined concrete suggestions for the challenges ahead.
Publishers must work together to stem the flow of unlicensed content.
As long as information is readily available and free, there is no scarcity to drive demand. Unless you limit supply, you cannot derive value. Content Delivery Networks (CDNs) such as Cloudflare and Fastly are on the front lines, as they see the majority of requests and responses across their networks. They have noted the sharp increase in AI-bot traffic and have expertise in the blocking and tackling of thousands of bots and spiders.
TollBit is keeping track of the AI crawlers with their quarterly State of the Bot reports
Publishers in the room are anticipating a world where the value of a search result on Google is less than the value of licensed access from an AI Agent. In such a world, it would be prudent to default to denying access to all crawlers in favor of direct access by verified, registered readers or licensed partners.
Old methods such as robots.txt and user-agent/IP-address blocking filters are readily ignored or spoofed by long-tail startups scrambling to serve their users. More secure methods are necessary. “Get a better lock on your door,” says Jon Roberts, Chief Innovation Officer at Dotdash Meredith.
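To see why user-agent filtering is such a weak lock, consider this minimal sketch. The crawler names are illustrative, not a real blocklist: a crawler that identifies itself honestly gets blocked, while a scraper that spoofs an ordinary browser string walks right through.

```python
# A naive user-agent blocklist of the robots.txt era.
# Bot names are illustrative examples, not a recommended list.
BLOCKED_AGENTS = {"GPTBot", "CCBot", "ClaudeBot"}

def naive_filter(user_agent: str) -> bool:
    """Return True if the request should be allowed through."""
    return not any(bot in user_agent for bot in BLOCKED_AGENTS)

# A well-behaved crawler identifies itself and is blocked...
print(naive_filter("Mozilla/5.0 (compatible; GPTBot/1.0)"))   # False
# ...but a spoofed browser string passes the same filter.
print(naive_filter("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15)"))  # True
```

Because the filter only trusts what the client claims about itself, the “lock” is on the honor system, which is exactly why publishers are looking to CDN-level bot management instead.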
Establish a marketplace for publisher information that can scale to serve AI Agents.
While Cloudflare’s pay-per-crawl implementation responds to unlicensed crawlers with an HTTP 402 (Payment Required) status that points AI developers to licensing information, TollBit and Dappier are building the first marketplaces for publisher libraries, a proof-of-concept bridge from the search-engine results of the past to the marketplace of the future.
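A pay-per-crawl gate of this kind might look roughly like the following sketch. The header name and licensing URL are hypothetical, not Cloudflare’s actual interface; the point is simply that unverified crawlers get a 402 with a pointer to terms, rather than the content itself.

```python
LICENSING_URL = "https://example.com/licensing"  # hypothetical

def handle_request(headers: dict):
    """Sketch of a pay-per-crawl gate: no license token means
    HTTP 402 (Payment Required) plus a pointer to licensing terms."""
    token = headers.get("crawler-license-token")  # illustrative header name
    if token is None:
        return (
            402,
            {"Link": f"<{LICENSING_URL}>; rel=\"payment\""},
            "Payment required: see licensing terms before crawling.",
        )
    # A real gate would verify the token against a licensing backend.
    return (200, {}, "<article>licensed content</article>")

status, resp_headers, body = handle_request({})
print(status)  # 402
```

The design choice worth noting is that 402 is a response, not a redirect: the crawler never receives the content, and the negotiation moves out of band to the licensing endpoint.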
For the past 25 years, a search returned 10 blue links, of which a human might click one or two. This referral traffic was the currency of the old internet, and that attention could be monetized any number of ways.
Google owned this marketplace.
Now imagine a world where a query by an AI Agent may spawn 20-50 queries, of which *every* article is scanned and synthesized into a single response. Ads and subscription funnels on these pages are ignored. There needs to be a different pricing model for this traffic. Source material will not be a “page” but could be a snippet of video, a schematic blueprint, a sound bite, or a product spec. The pricing model in this “marketplace of everything” needs to support dynamic pricing of requests based on who is requesting and in what format, all in real-time.
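A dynamic pricing model along these lines could be sketched as follows. The rate card, prices, and requester tiers here are entirely hypothetical; they only illustrate pricing resolved per request, per format, in real time.

```python
# Hypothetical rate card for a "marketplace of everything":
# price a content request by format and by who is asking.
BASE_RATES = {"snippet": 0.001, "article": 0.01, "video": 0.05}  # USD, illustrative
REQUESTER_MULTIPLIER = {
    "licensed_partner": 0.5,  # negotiated discount
    "ai_agent": 1.0,          # standard metered rate
    "unknown": 3.0,           # unverified traffic pays a premium
}

def price_request(fmt: str, requester: str) -> float:
    """Resolve a per-request price at query time."""
    multiplier = REQUESTER_MULTIPLIER.get(requester, REQUESTER_MULTIPLIER["unknown"])
    return BASE_RATES[fmt] * multiplier

print(price_request("video", "ai_agent"))          # 0.05
print(price_request("video", "licensed_partner"))  # 0.025
```

In a production marketplace the lookup would be a real-time auction or negotiated contract rather than a static table, but the inputs, format and requester identity, are the same.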
The programmatic advertising ecosystem which has been the engine driving the growth of online publishing for the past 25 years was subsidized by advertisers bidding for a reader’s attention on a web page. An entire tech-stack was built to serve up the right ad for the right audience at the right time, all in under 500 ms.
With AI Agents, you have readers paying directly for content with their subscriptions (ChatGPT Pro is $200/month) and the AI Agent acting as a proxy on their behalf. Once publishers have successfully shut off the free flow of their content, accurate, reliable, and up-to-date information will accrue value. In this world, there will be a need for a real-time marketplace to handle the access and metering of content, and this system will need to be built to the same scale and performance as the programmatic advertising platforms of today.
No one owns this ecosystem.
Tokenization – core components of a market
The final step, once you have established a way to meter (cost-per-crawl, cost-per-query or some other subscription model) access to a publisher’s library, is to establish a standardized way to track and count content as it travels through the ecosystem from a publisher to the AI platform.
Tokenization is this final step. Once an AI has asked for something, the response needs to be tokenized in such a way to properly attribute credit as well as track the usage of that content not only to the initial query but in all future derivative uses.
ProRata has a working implementation of this tracking in Gist Answers, their distributed search widget. “If you can track it, you can charge for it,” they say, as their IP is focused on the attribution of AI responses from their network of publishers.
TollBit offers one-time-use tokens for access and is setting up a system where an AI Agent can query for information, inquire about pricing, and generate and receive a token to retrieve the information as needed, on demand.
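A one-time-use token flow of this sort might be sketched like this. The function names, flat price, and in-memory registry are my own illustration, not TollBit’s actual API; the essential property is that a token is consumed the moment it is redeemed.

```python
import secrets

# Illustrative in-memory token registry; a real system would persist this.
_issued = {}

def quote_price(resource: str) -> float:
    """Step 1: the agent asks what a retrieval costs (flat, hypothetical)."""
    return 0.01

def issue_token(resource: str) -> str:
    """Step 2: after payment, mint an unguessable one-time token."""
    token = secrets.token_urlsafe(16)
    _issued[token] = resource
    return token

def redeem(token: str) -> str:
    """Step 3: one-time use -- the token is deleted as it is redeemed."""
    resource = _issued.pop(token, None)
    if resource is None:
        raise PermissionError("invalid or already-used token")
    return f"<content of {resource}>"

t = issue_token("/articles/123")
print(redeem(t))   # succeeds exactly once
# Calling redeem(t) a second time raises PermissionError.
```

Because each token maps to one retrieval, access and billing stay in lockstep: the publisher can count exactly what left the building and charge accordingly.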
All these are approaches to the same problem and my company, SimpleFeed, aims to participate in the delivery of publisher content, whether it be tokenized, vectorized (to assist in discovery), or otherwise value-added with filtering and meta-data augmentation as we have been doing for 20+ years.
I look forward to staying engaged with the individuals and companies that were gathered for this workshop. AI summarization is rapidly tearing down a business model that has worked for decades. Unless there is an agreed upon business model that is accessible to all players including small publishers and long-tail AI startups, we may lose the diversity of opinions and perspectives that have given us the open web we currently enjoy.
It makes sense that IAB, an industry group that helped establish standards around online advertising, is taking the first steps to establish standards around the AI Agentic web of the future. I thank them for taking this first important step of getting all the players in the room together and publishing the first framework for publisher content monetization and brand content management for LLMs and AI agents.
Publishers underwrite new projects based on forward-looking financials. If nothing is done to improve the economics of publishing online today, the investigative reporting of the future will not be funded and we will all be poorer for it.
Cloudflare sees about 20% of the world’s internet traffic cross its network, so its CEO, Matthew Prince, has a pretty good handle on trending internet behavior.
In the video below, he summarizes the impact of AI “answer engines” that summarize what they find in response to questions put to AI chatbots such as Google’s AI Overviews or OpenAI’s ChatGPT.
What he concludes is that the exchange that allowed the search engines to index a site in return for referred pageviews is no longer equitable and is growing increasingly lopsided.
The data, according to Cloudflare:
10 years ago: Google crawled 2 pages for every visitor it referred
6 months ago: Google 6:1, OpenAI 250:1, Anthropic 6,000:1
Present: Google 18:1, OpenAI 1,500:1, Anthropic 60,000:1
No one is saying what we all know in our hearts: no one is clicking through to the footnoted sites on an Answer Engine. The AI crawlers are taking more than they are giving back.
It’s worth watching the entirety of Matthew’s talk in the context of an announcement he references in the video. I expect it will be something related to their Bot Management suite, but I also look forward to hearing more about Cloudflare’s vision to remake the internet into one that rewards knowledge creation.
While the introduction of AI Answer Engines is a current and immediate threat to the online publishing ecosystem as we have known it, there is a nascent opportunity to re-build the internet on a new ecosystem built around different incentives.
Slide from a presentation on rebuilding an internet that rewards knowledge, not attention
What if we rebuilt the internet into one that rewarded Knowledge and not Attention? The “attention economy” brought us clickbait and distraction. What if we rebuilt the internet into one that rewarded publishers that gave more than they took away? Am I naive?
I am looking forward to learning more about Cloudflare’s plan.
Back then, pundits were concerned Google was taking on a whole host of potential copyright-infringement lawsuits as YouTube was chock full of pirated videos. YouTube has since built a sophisticated copyright detection algorithm that does a pretty good job of detecting not only pirated videos but also when copyrighted music is used as a video soundtrack.
Could the same be said for the potential acquisition of Perplexity? While all LLMs share an index made up of Common Crawl and anything else they can find on the open internet, will Perplexity’s vectorized index of exclusively licensed news sources drive enough usage and value to a potential acquirer?
While YouTube’s pre-acquisition copyright concerns ended up being nothing more than a speed bump, they eventually started a formal conversation around licensing content. Could Perplexity’s fledgling licensing program be the start of a more sustainable way to grow the new AI ecosystem?
A couple of years back, I was working with a publisher that complained that their site was really slow. As I opened up the site on my browser I immediately became suspicious as the fan on my MacBook kicked in and every process slowed to a crawl.
When I was finally able to crack the page open and see what was going on behind the scenes, I recorded what happened on each refresh.
In honor of Gene Hackman, I re-watched his classic role as the gritty NYC detective, Jimmy “Popeye” Doyle in the 1971 film, The French Connection. As any New Yorker will know, many of the locations in the film are familiar but the passage of time has made it harder to pinpoint exactly where.
Enter Perplexity Pro. Here is the first of what I think may be a series of AI-augmented blog posts called, “Cool things I asked AI.”
What follows is the output from Perplexity from my prompt
Where in NYC was the French Connection filmed?
The 1971 film “The French Connection,” directed by William Friedkin, is widely celebrated for its authentic portrayal of New York City’s gritty landscape during the early 1970s. This groundbreaking crime thriller, which won five Academy Awards including Best Picture, utilized numerous locations throughout the five boroughs to create its distinctive documentary-like atmosphere. Before beginning my detailed exploration of the film’s New York City locations, it’s important to note that “The French Connection” stands as one of the most ambitious location shoots ever undertaken in the city, utilizing 86 separate filming sites and capturing a New York landscape that has substantially changed over the subsequent decades.
Manhattan Locations
Manhattan served as a critical backdrop for many pivotal scenes in “The French Connection.” The film effectively utilized various neighborhoods to establish the urban landscape against which the drug trafficking investigation unfolds. One notable location was Le Copain restaurant at 891 1st Avenue, where Charnier dines during a key sequence in the film. The busy streets of midtown provided several important settings, including the scene where Doyle waits outside a restaurant near 50th Street and First Avenue.
The affluent Upper East Side was featured prominently, with Weinstock’s apartment filmed near 82nd Street and Fifth Avenue, close to the Metropolitan Museum of Art. This upscale neighborhood also housed the Westbury Hotel at 15 East 69th Street, which served as Charnier’s accommodations in the film. Another significant Upper East Side location was Ronaldo Maia Flowers at 27 East 67th Street, where the clever Charnier gives Detective Doyle the slip in one of the film’s most tension-filled moments.
The Roosevelt Hotel at 45th Street and Madison Avenue appears in a critical scene when Detective Russo first encounters “Frog One,” Alain Charnier. Grand Central Terminal also features prominently, with the 42nd Street Shuttle platform serving as the setting for a tense pursuit sequence. The Lower East Side, with its unique character and gritty atmosphere, provided several important locations, including Ratner’s Restaurant at 138 Delancey Street, a famous Kosher deli where Sal and Angie emerge during surveillance. According to location historians, Ratner’s was once a hangout for gangsters Meyer Lansky and Bugsy Siegel before it finally closed its doors in 2002.
Additional Manhattan locations included scenes shot in Little Italy, specifically at the 177 Mulberry Street drop point, and the foot of Manhattan Bridge at South Street and Market Street, where Doyle is shown emerging from a bar in his disheveled state.
Brooklyn’s Starring Role
Brooklyn provided some of the most memorable locations in “The French Connection,” most notably the setting for what many film historians consider one of the greatest car chase sequences in cinematic history. This iconic chase was filmed in Bensonhurst, Brooklyn, running under the BMT West End Line (then the B train, now the D train). The sequence specifically follows Stillwell Avenue, continues along 86th Street, and then turns right onto New Utrecht Avenue, culminating near the 62nd Street station. This complex action sequence took an astonishing five weeks to film and has become an enduring piece of cinema history.
Beyond the chase, Brooklyn offered numerous other locations that contributed to the film’s authentic atmosphere. Bushwick provided the setting for Sal and Angie’s Cafe at 91 Wyckoff Avenue. The now-demolished Rio Piedras theater at 914 Broadway on the Bushwick/Bed-Stuy border served as the location for the opening Santa Claus chase sequence. According to historical records, this was originally the Loew’s Broadway, built in 1904 with seating for 2,000, before being torn down in 1988, leaving a vacant lot in its place.
The Marlboro Housing Project, located near Avenues V, W, and X off Stillwell Avenue, was used to represent where Detective Popeye Doyle lived. Other significant Brooklyn locations included Columbia Heights at Squibb Hill, where Sal parks the Lincoln automobile that contains the smuggled heroin, and Remsen Street, where Charnier and Nicoli observe the car being unloaded.
Queens, The Bronx, and Island Locations
While Manhattan and Brooklyn received the most screen time, “The French Connection” utilized locations throughout the city’s other boroughs as well. In Queens, several scenes were filmed in the Ridgewood neighborhood, specifically along Forest Avenue and Putnam Avenue, as well as Onderdonk Avenue. The film also shot scenes in Maspeth, creating a comprehensive view of the borough.
The Bronx was represented in the film through scenes shot in the Hunt’s Point area, though these locations weren’t as extensively used as those in other boroughs. Two island locations proved critical to the film’s narrative: Randalls Island in the East River appears in several sequences, while Wards Island served as the setting for the film’s climactic final shootout.
The Triborough Bridge to Randall’s Island toll bridge at the east end of 125th Street was used for scenes where Sal Boca is tailed by the detectives. This infrastructure element helped connect the film’s various geographical components while providing a stark visual backdrop for the unfolding investigation.
Historical Context and Changes
“The French Connection” captured New York City during a particularly challenging period in its history. As noted by filming location experts, the city’s decay is front and center in nearly every frame, from “abandoned, grime-covered buildings and derelict cars to crumbling warehouses and trash-strewn lots”. This visual representation of urban decay provided an authentic backdrop for the story’s dark themes.
Producer Philip D’Antoni faced significant logistical challenges during production, as the film was shot primarily during New York’s unpredictable winter months of December through February to match when the actual events occurred. These weather conditions added another layer of difficulty to an already complex location shoot but contributed significantly to the film’s realistic atmosphere.
It’s worth noting that the film provides a fascinating historical document of the city’s appearance during this period. Some shots even captured the World Trade Center under construction, with the completed North Tower and partially completed South Tower visible in background scenes at the shipyard after Devereaux arrives in New York.
Over the decades since filming, many of these locations have undergone significant changes. The beautification of New York City over the past 25 years has come at a price, with many classic New York establishments and mom-and-pop stores replaced by more generic businesses. The gritty New York captured in “The French Connection” largely exists only in films now, providing an important historical record of the city’s visual character during the early 1970s.
Conclusion
“The French Connection” stands as a landmark achievement in location filming, utilizing the authentic landscapes of New York City to create a visceral, documentary-like crime thriller. The film’s extensive use of real locations throughout Manhattan, Brooklyn, Queens, the Bronx, and the city’s islands provided viewers with an unvarnished look at urban America in the early 1970s. From the iconic car chase beneath Brooklyn’s elevated subway lines to tense surveillance sequences in Manhattan’s diverse neighborhoods, the city itself becomes a central character in the narrative.
Beyond its cinematic achievements, “The French Connection” serves as an important visual document of New York during a specific historical period, capturing landscapes and establishments that have since been transformed or disappeared entirely. The film’s commitment to location shooting contributed significantly to its enduring impact and continues to influence filmmakers seeking to capture authentic urban environments on screen.