Horst D. Deckert

Almost all of my clients come from Germany, even though I set out on a long adventure 48 years ago.

This is how it all began:

On 1 August 1966 I began my vocational training, and in 1969 my part-time studies in public law and tax law.

Since 1 August 1971 I have been self-employed, working as a specialist for entrepreneurs' supposedly unsolvable problems.

In October 1977 I moved to Greece and worked from there with a portable typewriter and a Bakelite telephone. Every few months I drove or flew to my clients in Germany. At the time, Greece took no interest in taxes.

Until 2008 I spent most of my time in Greece, with interruptions. From 1995 to 2000 my tax residence was in Belgium, and since 2001 it has been in Paraguay.

From 2000 to 2011 I also had a tax-free residence in Mallorca. Since 2011 I have lived in Paraguay all year round.

I did not build my own house until I was 62, when I could pay for it in cash. Had I built it earlier, it would only have been possible with bank financing. I would then have been tied to one place and would have had to limit myself. I did not want that.

All my life I have combined business with pleasure. I have not been back to Europe since 2014. Many of my clients come to Paraguay to consult with me in person, about 200 investors and entrepreneurs per year.

With most clients, however, this also works perfectly well online or by phone.

Book a free consultation now

Big Brother on Board: UK Train Stations Use Amazon-Powered AI to Read People's Moods

In the UK, a series of AI trials involving thousands of train passengers who were unwittingly subjected to emotion-detecting software raises profound privacy concerns. The technology, developed by Amazon and employed at various major train stations including London’s Euston and Waterloo, as well as Manchester Piccadilly, used artificial intelligence to scan faces and assess emotional states along with age and gender. Documents obtained by the civil liberties group Big Brother Watch through a freedom of information request unveiled these practices, which might soon influence advertising strategies.

Over the last two years, these trials, managed by Network Rail, implemented “smart” CCTV technology and older cameras linked to cloud-based systems to monitor a range of activities. These included detecting trespassing on train tracks, managing crowd sizes on platforms, and identifying antisocial behaviors such as shouting or smoking. The trials even monitored potential bike theft and other safety-related incidents.

The data derived from these systems could be utilized to enhance advertising revenues by gauging passenger satisfaction through their emotional states, captured when individuals crossed virtual tripwires near ticket barriers. Despite the extensive use of these technologies, the efficacy and ethical implications of emotion recognition are hotly debated. Critics, including AI researchers, argue the technology is unreliable and have called for its prohibition, supported by warnings from the UK’s data regulator, the Information Commissioner’s Office, about the immaturity of emotion analysis technologies.

According to Wired, Gregory Butler, CEO of Purple Transform, has mentioned discontinuing the emotion detection capability during the trials and affirmed that no images were stored while the system was active. Meanwhile, Network Rail has maintained that its surveillance efforts are in line with legal standards and are crucial for maintaining safety across the rail network. Yet, documents suggest that the accuracy and application of emotion analysis in real settings remain unvalidated, as noted in several reports from the stations.

Privacy advocates are particularly alarmed by the opaque nature and the potential for overreach in the use of AI in public spaces. Jake Hurfurt from Big Brother Watch has expressed significant concerns about the normalization of such invasive surveillance without adequate public discourse or oversight.

Jake Hurfurt, Head of Research & Investigations at Big Brother Watch, said: “Network Rail had no right to deploy discredited emotion recognition technology against unwitting commuters at some of Britain’s biggest stations, and I have submitted a complaint to the Information Commissioner about this trial.

“It is alarming that as a public body it decided to roll out a large-scale trial of Amazon-made AI surveillance in several stations with no public awareness, especially when Network Rail mixed safety tech in with pseudoscientific tools and suggested the data could be given to advertisers.

“Technology can have a role to play in making the railways safer, but there needs to be a robust public debate about the necessity and proportionality of tools used.

“AI-powered surveillance could put all our privacy at risk, especially if misused, and Network Rail’s disregard of those concerns shows a contempt for our rights.”

