Now put this capability in the hands of a government intel analyst (in either country) who doesn't have nearly the same level of expertise and cultural awareness as anybody at ChinaTalk, and it's not hard to project very bad outcomes. Between "average of averages," janky single-source references, poison fountains, various other deliberate efforts to corrupt data and sourcing, lack of training on the capability, etc., you have a recipe for disaster. That's not a pre-ordained future. It just means education and training (plus human judgment) become more important than ever, along with demanding the same level of analytic rigor people expected in the pre-AI days.
It's not only paywalls, but also the robots.txt files on websites. Claude's web fetch honors those files, and many sites simply block AI traffic, or Anthropic's traffic specifically, in their robots.txt. This holds, for example, for the main Finnish news sites, including the big one funded by the government.

Among AI bots, Grok seems to be an exception here: its default harness at X appears to ignore robots.txt entirely.
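For what it's worth, you can check whether a given crawler is blocked using Python's standard-library robots.txt parser. This is a minimal sketch against a hypothetical robots.txt that singles out Anthropic's crawler; the user-agent name is illustrative, not a claim about what any particular site actually blocks:

```python
from urllib import robotparser

# Hypothetical robots.txt: block Anthropic's crawler, allow everyone else.
ROBOTS_TXT = """\
User-agent: ClaudeBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The AI crawler named above is refused; an unlisted bot falls through
# to the wildcard rule and is allowed.
print(rp.can_fetch("ClaudeBot", "https://example.com/article"))      # → False
print(rp.can_fetch("SomeOtherBot", "https://example.com/article"))   # → True
```

A robots.txt-honoring fetcher does this kind of check before every request, which is why a single `Disallow: /` line is enough to make a whole site invisible to a well-behaved AI tool.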
The types of analysis you’re looking for sound like they’d benefit from writing a custom skill. I imagine any skill for foreign policy should establish a basic landscape of information and what level of verification we want.
Jack, when are you gonna come on Second Breakfast to shoot the shit?
When I feel like I can hold up my end of the bargain as compared to all your other superstar guests!
Interesting exercise and I enjoyed your evaluation of it!