Gun shopping has become a mental health crisis, as people irrationally stockpile bullets and automatic weapons they have no reason to own (and many reasons not to).
They say Californians like “King” size beds so they can fit more assault rifles under them. Not even kidding.
Presumably the number of guns hauled here for photos correlates directly with the number of NRA-built loopholes this felon found while gun shopping.
A former elite gymnast who claims he “tracked ISIS recruits with Palantir in Syria” now wants to be the definitive judge of “where criminals come from” in America. Let’s talk about what his Peregrine-themed company really means…
Adrenaline. Not calm and reasoned thought. Not justice and transparency. Adrenaline.
His Peregrine code has leaked, revealing the same dangerously flawed logic used in Iraq, Afghanistan, Syria… to create ISIS.
def surveillance_feedback_loop(population):
    while True:
        # Step 1: Deploy mass surveillance
        # Just like COINTELPRO watching breakfast programs
        targets = identify_potential_threats(population)

        # Step 2: Create pressure on communities
        # See: Every failed counterinsurgency campaign ever
        surveilled_groups = apply_monitoring(targets)

        # Step 3: Feed the alienated back in as tomorrow's "threats"
        # The loop never terminates; that's the point
        population = surveilled_groups
To put it plainly, Palantir ran four profit models abroad that increased dangers, and a spin-off called Peregrine is now transferring them domestically to America.
Initial Deployment
Claimed: “Identify ISIS recruits”
Actually: Targeted entire communities
Result: Created collective punishment… PROFIT!
Community Impact
Claimed: “Prevent radicalization”
Actually: Fragmented social structures
Result: Increased isolation and alienation… PROFIT!
Threat Generation
Claimed: “Reduce extremism”
Actually: Created conditions for recruitment
Result: Generated the terrorists they claimed to be tracking… PROFIT!
System Response
Claimed: “Adapt to threats”
Actually: Expanded targeting
Result: Accelerated the cycle… PROFIT!
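The four stages above can be sketched as a toy feedback model. The function and numbers below are entirely illustrative assumptions, not anything from Peregrine’s actual code, but they show why a system that removes a few “threats” while generating many more accelerates rather than resolves the problem:

```python
def surveillance_cycle(threats: int, rounds: int) -> int:
    """Toy model of the four-stage cycle: each round removes a few
    'threats' but the collective-punishment backlash generates far
    more, so the count the system chases keeps growing."""
    for _ in range(rounds):
        removed = threats // 10    # the handful of real interdictions
        generated = threats // 2   # alienation recruits many more
        threats = threats - removed + generated
    return threats

# Start with 100 "threats" and run five rounds of the cycle.
print(surveillance_cycle(100, 5))  # prints 539
```

Five rounds turn 100 “threats” into 539, which is exactly the Threat Generation… PROFIT! dynamic described above: the system’s own output justifies its expansion.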
The playbook is clear:
In Syria: Label communities as “ISIS suspects”
In America: Rebrand activists as “extremists”
The result? Same as Nixon’s era, just with AI-generated PowerPoint slides.
The rising cost of floggings will continue until morale improves.
Peregrine isn’t just replicating a system that failed abroad. They’re replicating a system that succeeded at its unstated goal: the Syria deployment expanded surveillance infrastructure through threat generation.
A User’s Guide to Hidden Success in Overt Failure
Hey Peregrine people, let’s talk about this exciting new “integrated law enforcement platform” vacuuming up citizen data across the nation like Woodrow Wilson’s nationalization of telephone lines to disenfranchise his opponents (e.g. non-whites, labor unions) from government.
But first, a quick pop quiz about how integrated law enforcement platforms work in practice:
Q: The FBI had MLK under comprehensive surveillance for years. How’d that work out?
A: They recorded his conversations, tracked his movements, infiltrated his organization… and completely failed to prevent his assassination. But hey, they did manage to send him a letter suggesting he commit suicide, so there’s that “data-driven impact” for you.
Speaking of surveillance and psychological manipulation, do you remember when Russian intelligence surveilled Olympic athletes and sent them targeted messages to destabilize their mental health and knock them out of competition?
In 2016, Russian military intelligence selectively leaked medical records and sent personalized messages to athletes, trying to push them to mental breakdown or even suicide.
Can your system yet convince people to kill themselves? Russia really wants to know.
They’ve already demonstrated how surveillance plus competitive targeting equals psychological warfare. I mean I’m very sure a hyper-aggressive competition-minded gymnast’s “integrated law enforcement platform” would never be used in such a way that we already have seen over and over again…
Greatest Hits of Peregrine Predecessors
Let’s review some other spectacular examples of safety technology succeeding at being used for oppression while failing at… safety:
COINTELPRO (1956-1971): Meticulously documented Black Panthers feeding children breakfast
SHAMROCK (1945-1975): Read millions of telegrams to harass civil rights leaders
Today: Exxon contractors hack climate activists. Because nothing says “public safety” like targeting people trying to prevent planetary disaster.
Technology Changes, Patterns Don’t
The Palmer Raids (hat tip to a comment by “Not Nick Noone”) used then-cutting-edge telephone surveillance and census data to round up political dissidents while actual bombers remained at large. Sound familiar, Peregrine?
Remember when IBM’s punch cards made the Holocaust more “efficient”? But don’t worry, this time the data lives in microservices, not punch cards, so it’s totally different.
Austria’s census data enabled perfect targeting of Jewish communities. But hey, at least their data integration was on point.
Syria’s surveillance infrastructure came with great Western tech support. Those monitoring centers had excellent uptime!
Peregrine’s Innovation in White-washing
Now here’s Peregrine, with glossy brochures selling a shiny future of policing like nobody remembers:
Multi-agency data sharing (like Operation MINARET’s illegal intel sharing)
Automated targeting (like Japanese internment’s IBM cards)
Watch the Peregrine CEO, a former elite gymnast, start telling the press that he really wants to push beyond rushing criminal convictions through courts and into becoming the definitive judge of all societal ills, defining “where criminals come from”…
I try to have a lot of adrenaline for the competition.
Historians, this is your cue. We’ve studied this many times before, and it never ends in happy ever after or “and then everything was fine.” A former gymnast honed in adrenaline-fueled competition… applies his training and “embedded” militarized experience into mass domestic surveillance and predictive policing. What could possibly go wrong?
Notice the tired and sad pattern here? Every single adrenaline-driven “innovative” surveillance system excelled mainly at controlling targeted populations to “win” in rushed competitions, while failing at its stated security purpose.
The gymnast co-founder proudly boasts that he tracked ISIS recruits with Palantir in Syria, which means he hasn’t yet been held accountable for creating the terrorists he claimed to be preventing.
No joke. Not an exaggeration.
The Peregrine co-founder literally could face charges over his unsubstantiated accusations abroad. Just think: if he hadn’t gotten away with Palantir’s harms, such as fueling a rise in terrorism, and had instead been held accountable, he wouldn’t now be hawking a domestic version of the same system, expected to end in yet another societal disaster.
1960s: “Surveillance will prevent violence!”
Result: Harassed civil rights leaders, missed terrorists
2024: “Peregrine will transform data into impact!”
Coming Soon: Targeted activists, missed threats, record profits
The Only Thing We Learn…
The NYPD’s demographic unit spent years mapping Muslim communities after 9/11. Did it prevent any terrorism? No. Did it destroy community trust and create detailed data for targeting minorities? You bet!
Now Peregrine wants to “optimize resource allocation” with the same capabilities that have consistently optimized oppression while failing at safety.
Peregrine’s Future is Written in the Past
Every single time we’ve built these systems, they’ve failed at their stated purpose while succeeding spectacularly at political control. But I’m sure Peregrine’s version is so different, because they were born out of Palantir’s total failure and pivoted to domestic terms like “data-driven” and “real-time analytics” to more quickly incarcerate citizens into a Kafkaesque fever dream.
Here we are in 2024, watching a gymnast who helped create the threats he claimed to prevent in Syria perform his next routine: selling that same failed system to control Americans. The military-industrial-congressional-complex judges might give him perfect scores, but history already knows how this performance ends.
But hey, at least the dashboards destroying society are pretty.
…right?
Just ask Syria.
Remember: These systems don’t fail at threat detection.
They succeed at threat creation.
That’s not a bug.
That’s the business model.
When Colonel “Mad Mike” Hoare and his disgraced men stood trial in 1982 for a failed mercenary operation in the Seychelles, he justified their actions by claiming to be “the bastion of civilization in Africa” fighting against what he called a “Communist onslaught.”
Hoare’s mercenary force, the “Wild Geese,” embodied a militant survivalist ethos popularized in certain South African circles during the Cold War.[1]
This wasn’t happening in isolation, but rather in concert with wider efforts to spread white nationalist thinking around the world. In 1940, Elon Musk’s grandfather was arrested in Canada as a national security risk for his leadership role in the extreme-racist “Technocracy” movement. Despite having built himself an elite life with a 20-room home and private aircraft, he fled to South Africa specifically to help lead the newly established post-WWII apartheid regime… building an even bigger home and more private aircraft on the back of state-sanctioned racism. His vision of a technologically-enforced white ethnostate for his children and grandchildren would echo through generations, and set the stage for his grandson Elon Musk’s push into the same vision.
By 1988-89, as apartheid crumbled under international pressure for democratic reform, wealthy beneficiaries of the system rushed to move their assets internationally. The Musk family, by their own account, hurriedly sold everything and moved their racist pile of wealth to Canada, following a common pattern of capital extraction before a racist regime’s collapse. This mirrored earlier patterns of flight by others, such as Peter Thiel’s parents, who profited immensely from racist extraction systems and sought to avoid accountability after war and during democratic transitions.
Today, these survivalist themes have evolved into something more insidious: a sophisticated form of technological fraud that preys upon the same fears and desires that once made advance-fee schemes so effective. Just as “Nigerian prince” scams targeted educated professionals by exploiting their specific blind spots about international finance and wealth extraction, today’s marketing of “apocalypse-proof” vehicles exploits educated consumers’ technological blind spots. Doctors, lawyers, and other highly trained professionals who would never fall for a crude email scam find themselves vulnerable to slick presentations about “full self-driving” and “bulletproof” vehicles — precisely because their expertise in other fields doesn’t transfer to evaluating complex engineering claims.
The Cybertruck represents a masterclass in this kind of deception. Its highly targeted disinformation pitches transform military survivalist themes of white nationalism into a consumer product while playing on the same psychological vulnerabilities that make advance-fee fraud so persistent. Just as scammers promise vast riches for a small upfront investment, the Cybertruck promises “invincibility” for the price of a luxury vehicle to those prone to believing in a rapid elevation in selfish privilege. The same worldview that once described “wild humans” below private aircraft now markets vehicles as futurist personal fortresses against imagined threats — but beneath the poorly-designed false promises and dangerously-poor quality lies a deadly bait-and-switch.
The tragic deaths of three college students in Piedmont, California in November 2024 throw Elon Musk’s whole deception strategy into stark relief.
Perhaps in a way directly related to the fraud of its marketed indestructibility, the Cybertruck abruptly “veered” off Hampton Road in the early morning hours, struck a tree and a concrete wall, and burst into flames. This crash fits a pattern familiar to experts in technological fraud: victims, believing in promised protections, take risks they otherwise wouldn’t for promised future gains that turn only into massive losses.
Just as a mark might drain their savings believing in guaranteed returns from an African prince, Cybertruck owners throw money at false confidence in their vehicle’s supposed “survival” design.
The latest crash isn’t unique to one Tesla model; it follows a pattern well known to anyone tracking Tesla’s rapidly rising death toll across every model. The National Highway Traffic Safety Administration (NHTSA) and National Transportation Safety Board (NTSB) are now investigating this predictable tragedy of Tesla. The Cybertruck has seen six recalls and three previous investigations of the vehicle this year alone, including an August 2024 crash in Texas with eerily similar characteristics that really should have “grounded” all Tesla vehicles… to save lives from fraud.
The even bigger through-line from 1980s South Africa to modern Silicon Valley reveals a consistent pattern and origin of the tragedy: sophisticated fraud schemes that exploit specific blind spots in otherwise capable people’s knowledge. From Hoare’s mercenaries believing that they could overthrow a government dressed as a “beer-drinking tourist party,” to wealthy Musk and Thiel families believing they could permanently extract wealth ahead of democratic transitions, to today’s marketing of “apocalypse-proof” consumer products, there is a theme that stays constant. The victims change, but the exploitation of targeted ignorance persists.
The Piedmont tragedy thus raises urgent questions about how racist 1980s South African-themed marketing narratives are showing up in the 2020s to influence risk perception, particularly among communities like Piedmont that desire “safety” so badly they fall victim to a snake oil salesman. When vehicles are marketed to wealthy families as virtually indestructible, it may create a false sense of future gains that sets them up instead for a tragic end, like the American Vietnam vet who was shot up and barely survived following Hoare’s failed coup attempt. The fact that this crash occurred at 3 AM, with young college students home for Thanksgiving break, suggests the deadly potential of combining buggy software, buggy hardware, and marketing that emphasizes the exact opposite of reality: the unmistakable bogus elixir of mystical invulnerability in a cooked-up vision of false threats.
As Captain Chris Monahan of the Piedmont police noted, they are “looking into actions that occurred before the collision.” But beyond the specific circumstances of this crash, we must examine the broader implications of marketing military-grade protection to civilian communities. Piedmont is known for its 0% Black population, surrounded by communities with 30% Black residents. Does anyone really think marketing to this community wouldn’t influence how people there — particularly young people — perceive risk as a function of their race-based privilege?
The candles and handmade cards left at the accident scene tell a different story than the glossy marketing. They remind us that nobody who dons a stainless steel version of the white robe with an X is truly invincible, no burning cross is an actual safety act, and that the cost of believing such things can be devastatingly high.
As investigations continue and the Piedmont community mourns, we must consider how the white-flight mindset of survivalist marketing narratives has evolved from Hoare’s era to today, and more importantly, how it might influence decisions that put lives at risk. The same racist-driven swagger that led Hoare’s “Wild Geese” to disaster has been repackaged into a consumer product, but the potential for tragedy remains. Instead of getting themselves shot up in a firefight trusting Elon Musk’s grandfather in 1981, these kids were driven by Elon Musk straight into a tree and burned alive in 2024.
For the sake of future young lives, we must look past false narratives of mystical indestructibility, whether they come from white African technocrats, white African mercenaries, or white African manufacturers promising a trip to Mars, and recognize that true safety comes not from racist narratives of apocalyptic survival, but from building democratic institutions that allow for actual risk science: peaceful reform and representation.
[1] “Cooked Goose”, Time Magazine, Monday, Aug 09, 1982 (provided here for reference from an original printed copy)
“Mad Mike” gets ten years.
During his five-month trial, Colonel Thomas Michael (“Mad Mike”) Hoare, who gained notoriety while soldiering for fortune in the Congo during the 1960s, put up a plucky front. During recesses Hoare entertained visitors with tales of his derring-do and signed copies of his swashbuckling biography, entitled Congo Mercenary. But last week the bravado was gone from the man who used to run a swaggering group of commandos in the Congo who called themselves the Wild Geese. His face ashen, Hoare, 63, slumped in his chair in a Pietermaritzburg courtroom as Judge Neville James found him and 42 fellow mercenaries guilty of airplane hijacking and sentenced Mad Mike to ten years in prison.
Hoare and his mercenary band of brothers were forced to stand trial following their bungled attempt last November to overthrow the socialist government of the Seychelles led by President Albert René. The armed mercenaries entered the Seychelles disguised as a beer-drinking tourist party, “The Ancient Order of Froth-Blowers.” Hoare’s objective was to return to power ex-President James Mancham, 49, a pro-Western leader who was deposed by René in a 1977 coup.
But the operation failed when a Mahé airport customs inspector found a weapon hidden in a Froth-Blower’s luggage. A gunfight broke out at the airport, in which one mercenary was killed and several others wounded. Desperate to escape, the raiders fought their way to the control tower, guided an incoming Air India 707 to a landing and commandeered the plane. They forced the Air India pilot to fly them 2,500 miles across the Indian Ocean to Durban.
Lawyers for Hoare argued that the mercenaries had harmed no one nor demanded any ransom. Indeed, the government had initially released most of the men after their flight to South Africa, holding only Hoare and four others on the lesser charge of kidnaping, which carries no minimum sentence. But that leniency was abandoned after other nations, including the U.S., warned that South Africa could be struck from air-travel routings unless Pretoria enforced international agreements against harboring of air hijackers. The government then brought hijacking charges against all 43 of the escaped mercenaries.
Only one was declared not guilty last week: Charles William Dukes, an American veteran of Viet Nam, who was carried onto the 707 under heavy sedation after being seriously wounded during the Seychelles gunfight. He was ruled incapable of having taken part in the heist.
Throughout the trial, the Irish-born Hoare insisted that his operation had had the blessing of the South African government. “I see South Africa as the bastion of civilization in an Africa subjected to a total Communist onslaught,” he said. “I foresee myself in the forefront of this fight for our very existence.” Indeed, more than half of the convicted mercenaries had been members of either the South African Defense Force or the army reserve. There was also evidence that Soviet-made AK-47s and Chinese grenades and ammunition used by the mercenaries had been supplied by South African Defense Force officers.
Judge James declared that individuals in the National Intelligence Service and Defense Force had clearly known about the operation but, nonetheless, ruled that allegations of an official South African connection to the operation were “purely hearsay.” The day after the trial, Prime Minister P.W. Botha, who had refrained from commenting until the legal proceedings were completed, insisted that the government had not known of the affair. He charged that Hoare had approached members of the intelligence and military forces with his plan and admitted that arms and ammunition had been given to him. Botha said that “departmental action” would be taken against anyone who had cooperated with Hoare.
The colonel, who got his rank in the Congo, drew the stiffest sentence. His fellow raiders were given from six months to five years, and the judge later reduced most to six months. Hoare and his co-defendants were clearly the lucky ones. Last month four of Hoare’s soldiers of fortune who were left behind in the Seychelles were convicted of treason by René’s government. They are under sentence of death by hanging.
In 1941, Congress passed the National Cattle Theft Act to crack down on interstate cattle rustling. Today, tech giants face similar scrutiny for how they handle our personal data – but calling it “handling” is like calling cattle rustling “secret livestock relocation.” The cattle theft law was brutally simple: steal someone’s cow, cross state lines, face up to $5,000 in fines (about $100,000 today) and five years in prison. You either had someone else’s beloved Bessie or you didn’t.
But data “theft” in our age of artificial intelligence? America’s tech oligarchs have made sure nothing stays that simple. When companies like Microsoft and Google harvest our personal data to train AI systems, they’re not just taking, they’re effectively duplicating and breeding. Every piece of your digital life from search history to social media posts, photos to private messages is treated like human livestock in their “data” centers, endlessly duplicated and exploited across their server farms to maximize growth for exploitation. Unlike cattle rustlers who at least had to know how to tie a knot, these digital ranchers have convinced courts and Congress that copying and exploiting your life isn’t really theft at all. It’s just “data sharing.”
As described in a recent Barings Law article, these tech giants are being challenged on whether they can just repurpose our data for their benefits. Their defense? You clicked “I agree” on their deliberately incomprehensible terms of service.
It’s like a cattle rustler claiming the cow signed a contract. It’s like the Confederacy publishing books claiming the slaves liked it (true story, and American politicians to this day still try to corrupt schools into teaching that slavery is good and accountability for it is bad).
[Florida 2023 law says] in middle school, the standards require students be taught slavery was beneficial to African Americans because it helped them develop skills…
The historical parallel that really fits today’s Big Tech agenda isn’t cattle theft — it’s darker, as in racist slavery darker. Think about how plantation owners viewed human beings as engines of wealth generation, officially designated as “planters” in a system where the colonus (farmer) became colonizer. Today’s tech giants have built a similar system of value multiplication, turning every scrap of our digital lives into seeds for their AI empires.
When oil prospectors engaged in highly illegal competitive horizontal drilling in Texas to literally undermine ownership boundaries, at least they were fighting over something finite. But data exploitation? It’s infinite duplication and leverage. Each tweet, each photo, each private message becomes raw material for generating endless new “property” all owned and controlled by the tech giants.
Have you seen Elon Musk’s latest lawsuit where he falsely tries to claim that all the user accounts in his companies are always owned solely by him and not the users who create them and use them?
The legal framework around data rights hasn’t evolved by accident. These companies have deliberately constructed a system where “consent” means whatever they want it to mean as long as it benefits them. Your data isn’t just taken, it’s being cloned, processed, and used to build AI systems that further concentrate power in their hands. Could you even argue that a digital version of you they present as authentic, isn’t actually you?
The stakes go far beyond simple questions of data ownership. We’re watching the birth of a new kind of wealth extraction that denies real consent; one that turns human experience itself into corporate property with no liberty or justice for anyone captured.
The historic cattle laws stopped rustlers. The historic oil laws eventually evolved to protect property owners from subsurface theft. Today’s challenge is recognizing and confronting how tech companies have built an empire on an expectation of unlimited exploitation of human lives just because they are digital too.
As these cases wind through the courts, we’re left with a crucial question: Will we let companies claim perpetual rights to multiply and profit from our digital lives just because we were dragged against our better judgment into their gigantic monopolistic services as if the Magna Carta never happened? Should clicking “I agree” grant infinite rights to extract value from our personal data, creative works, and social connections like we’re meant to be serfs under a digital kleptocrat?
The answer will shape not just our digital future, but our understanding of fundamental human rights in the age of artificial intelligence.