The Guernica of AI - A warning from a former Palantir employee in a new American crisis

PAF

by Juan Sebastián Pinto , former Palantir employee


Gaza, one of the most extensive testing grounds of AI-enabled air doctrine to date, is today’s equivalent of Guernica in 1937. Over the past year of conflict, it has become the latest proving ground for breakthrough warfare technologies deployed against a confined, civilian population — and a warning of more atrocities to come. The Israel Defense Forces’ use of American bombs and AI-powered kill lists generated, supported, and hosted by American AI software companies has inflicted catastrophic civilian casualties, with estimates suggesting that up to 75% of the victims were non-combatants. Lavender, an error-prone, AI-powered kill list platform used to drive many of the killings, has been strongly linked to (if not inspired by) the American big-data company Palantir. IDF intelligence agents have anonymously revealed that the system deemed 100 civilian casualties an acceptable level of collateral damage when targeting senior Hamas leaders.

Yet, instead of reckoning with AI’s role in enabling humanitarian crimes, the public conversation about AI has largely revolved around sensationalized stories driven by deceptive marketing narratives and exaggerated claims. Stories which, in part, I helped shape. Stories which are now being leveraged against the American people to drive the rapid adoption of evolving AI technologies across the public and private sectors — all aimed at an audience that still doesn’t understand the full implications of big-data technologies and their consequences.

I know because — for a year and a half after the pandemic — I worked at Palantir Technologies from their new headquarters in Denver, Colorado. There I marketed their core software offerings — Gotham, Foundry, and Apollo — while also developing written materials and diagrams regarding the AI kill chain: the semi-autonomous network of people, processes, and machines (including drones) involved in executing targets in modern warfare. These technologies, which Palantir co-developed with the Pentagon in Project Maven, sought to become the “spark that kindles the flame front of artificial intelligence across the rest of the Department,” according to U.S. Air Force Lt. Gen. Jack Shanahan.

But this was only the beginning. While helping my team explain the advantages of AI warfare to US defense departments, I simultaneously helped sell AI technologies to Fortune 100 companies, civilian government agencies, and even foreign governments for a range of applications, from healthcare to sales.

For a time, I truly felt — as Palantir CEO Alex Karp recently put it — that the Palantir “degree” was the best degree I could get. That I would mostly participate in creating efficient and needed solutions to the world’s most complicated problems. However, over the course of bringing dozens of applications to market, I came to a dark personal realization: the core idea underlying most commercial AI analytics applications today — and the philosophy underlying the kill chain framework in the military — is that, through continual surveillance, data analysis, and machine learning, we can achieve a simulated version of the world, in which a nation, army, or corporation can gain competitive advantage only by knowing everything about its targets and delivering results autonomously, before its adversaries do.

Building these competing simulations — of a factory, a battleground, a connected vehicle fleet — often called “digital twins,” is not only the business of Palantir, but of established technology players like IBM and Oracle, and of hundreds of new startups alike. They are all furiously staking out market share in data applications across every industry and world government, unleashing a paranoid process of comprehensive digitalization and simulation while establishing surveillance infrastructures and “moats” of proprietary knowledge, information, and control in every market and in every corner of our lives.


Palantir, Anduril, SpaceX, and OpenAI are now reportedly in talks to form a consortium to bid on defense contracts; meanwhile, Google has abandoned its pledge not to use its technology for weapons and surveillance systems. Next, Palantir’s CEO, Alex Karp — along with many other tech leaders and their political and business allies — will argue that we should become a “technological republic” and that it’s time we welcomed the intervention of Silicon Valley startups into many more of our government and public institutions. Along with many other tech companies vying for a piece of the action, they stand ready to transform many of our democratic systems, government functions, and decision-making processes with largely unproven technologies reined in by few restrictions and controlled by the most powerful individuals in the world.




Continue to full article:


 
What has been written here reminds me of the lunchtime discussions we had at "the project". None of this is news to people who have been in the field. In a world where the American "government" was not an evil entity on the whole, I would be strongly advocating they get that bull by its balls while it is still a baby because when it is grown, there will likely be no reeling it in. But because all "government" doesn't rise to so much as diarrhea, such words would be wasted because said entities will be the primary customers of the vendors.

I agree with the implication that the mean public has absolutely no clue what is coming, and when it gets here in force they are going to find out very quickly just what an error they made when they decided that Monday Night Football and pro wrestling were more important than looming concerns such as this.

Thankfully, I am approaching the end of my days, though I may yet live to see the shit slam into the fan. It should be amusing to observe the reactions, though now that I'm thinking about it, these impositions may be accepted with great joy by the mean man. His addiction to convenience sets him up as a perfect candidate for becoming a whore at the beck and call of the Candyman. Time will tell.
 

"Thankfully, I am approaching the end of my days, though I may yet live to see the shit slam into the fan. It should be amusing to observe the reactions, though now that I'm thinking about it, these impositions may be accepted with great joy by the mean man. His addiction to convenience sets him up as a perfect candidate for becoming a whore at the beck and call of the Candyman. Time will tell."



Revelation 11:7-10: "And when they have finished their testimony, the beast that rises from the bottomless pit will make war on them and conquer them and kill them, and their dead bodies will lie in the street of the great city... For three and a half days some from the peoples and tribes and languages and nations will gaze at their dead bodies and refuse to let them be placed in a tomb, and those who dwell on the earth will rejoice over them and make merry and exchange presents, because these two prophets had been a torment to those who dwell on the earth."


Revelation 9:20-21: "The rest of mankind, who were not killed by these plagues, did not repent of the works of their hands nor give up worshiping demons and idols of gold and silver and bronze and stone and wood, which cannot see or hear or walk, nor did they repent of their murders or their sorceries or their sexual immorality or their thefts."

Revelation 16:9: "They were scorched by the fierce heat, and they cursed the name of God who had power over these plagues. They did not repent and give him glory."

Revelation 16:11: "and cursed the God of heaven for their pain and sores. They did not repent of their deeds."

...

People always say, "If God's real, why doesn't he 'do' something? Why doesn't he 'prove' it?" He did, He does, and He will. But as He is making quite obvious in these modern times (e.g., "flat earthers," "genocide deniers"), that doesn't mean people are going to believe, even if He goes Old Testament (He will).
 