Pegasus used to infect iPhones owned by Catalonian officials

Pegasus, NSO Group's spyware used to hack iPhones, has been caught up in another spying scandal, with the surveillance tool used against devices owned by civil society figures and politicians in Catalonia, Spain.

NSO Group made Pegasus and sold it to law enforcement agencies and governments.


Following a 2020 report claiming senior Catalonian politician Roger Torrent and pro-independence supporters were targeted by "government-grade spyware" via WhatsApp, Citizen Lab launched an investigation into wider spyware use against officials and people of interest in the region. On Monday, the investigation revealed evidence that another tool was used: Pegasus.

At least 63 people were targeted or infected by Pegasus, the report claims, while four others were targeted with Candiru spyware, and two were targets of both tools. The list of victims includes Catalan presidents, legislators, members of civil society organizations, Members of the European Parliament, and their family members.

While Citizen Lab doesn't directly attribute blame for the attacks, it does say there's extensive circumstantial evidence pointing in the direction of the Spanish government.

As one of the wealthiest autonomous regions of Spain, Catalonia has a long history of attempting to expand its autonomy, typically opposed by the Spanish government. This was especially evident in 2017 during an independence referendum that was deemed illegal by the Spanish Constitutional Court, with police allegedly turning away voters and supposedly using excessive force.

Shortly after the vote was approved by the Catalan Parliament, the Spanish government dissolved that parliament and scheduled new elections. Since then, participants in the referendum have been sent to prison, and Spain continues to fight the independence movement.

The investigation determined that of the 63 targets, 51 had forensically confirmed infections. However, since Android is far more prevalent than iOS in Spain, and the forensic tools used by investigators are more developed for iOS, the report believes it "heavily undercounts the number of individuals likely targeted and infected with Pegasus because they had Android devices."

Several instances of "off-center" targeting were spotted, in which family members, close staff, and other individuals connected to a person of interest were infected, enabling data collection about the subject without infecting their device directly.

All Catalan Members of the European Parliament who supported independence were targeted, either directly or via those around them, including three direct infections of MEPs and two off-center attacks.

Other identified targets include civil society organizations that supported independence, such as Assemblea Nacional Catalana and Omnium Cultural, as well as lawyers representing prominent Catalans.

"Homage" and evidence

In terms of how Pegasus worked, zero-click iMessage exploits were attempted between 2017 and 2020, a common technique. However, in late 2019 a previously undisclosed zero-click exploit was deployed, which has been named "Homage."

Homage involved an iMessage zero-click component that launched a WebKit instance after performing a lookup for a Pegasus email address. The WebKit instance fetched JavaScript scaffolding, which in turn fetched the exploit itself.

The scaffolding could even determine the model of iPhone by comparing screen resolutions against possible matches, checking whether "display zoom" mode was engaged, and measuring the time it took to encrypt a buffer.
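The screen-resolution check described in the report can be illustrated with a short sketch. This is a hypothetical reconstruction in TypeScript, not code from the actual exploit: the logical (CSS point) screen sizes in the table are real, but the function names and structure are illustrative assumptions.

```typescript
// Hypothetical sketch of resolution-based model fingerprinting.
// The logical screen sizes are accurate for these models, but this
// is an illustrative reconstruction, not the exploit's actual code.
interface Candidate {
  model: string;
  width: number;  // logical width in CSS points
  height: number; // logical height in CSS points
}

const KNOWN_MODELS: Candidate[] = [
  { model: "iPhone 8", width: 375, height: 667 },
  { model: "iPhone 8 Plus", width: 414, height: 736 },
  { model: "iPhone X / XS / 11 Pro", width: 375, height: 812 },
  { model: "iPhone XR / 11", width: 414, height: 896 },
];

// Match a reported screen size against known models. In a browser
// context, the inputs would come from screen.width / screen.height.
function matchModels(width: number, height: number): string[] {
  return KNOWN_MODELS
    .filter((c) => c.width === width && c.height === height)
    .map((c) => c.model);
}
```

Note that a 375x812 screen still leaves several candidate models, which is presumably why the report says the scaffolding also checked "display zoom" and timed an encryption benchmark to separate devices with identical screens.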

It appeared that domains linked to the exploits were controlled by a single Pegasus customer, indicating that the attacks were all performed by one entity. Spain's Centro Nacional de Inteligencia (CNI) was reportedly a customer of NSO Group, with the country's Ministry of Interior potentially able to perform the same attacks.

Other circumstantial evidence includes the timing of targeting coinciding with events of interest to the Spanish government, bait text messages whose content implied access to personal information such as official ID numbers, and the targets being of "obvious interest to the Spanish government."

Citizen Lab believes the seriousness of the case "clearly warrants an official inquiry to determine the responsible party, how the hacking was authorized," the legal framework, the scale of the operation, and what hacked data was used for. It also viewed the case as notable "because of the unrestrained nature of the hacking activities."

The report into the Catalan attacks using Pegasus arrives a week after it was determined that senior European Commission officials were targeted by attackers in 2021, with the same tools used to try to gain access to their smartphones.


Comments

  • Reply 1 of 9
    lkrupp Posts: 10,557 member
    So at least we now know that ALL governments are hell bent on surveilling their citizens and will do anything to prevent effective security and privacy. 
  • Reply 2 of 9
    docno42 Posts: 3,755 member
    Ugh - wish there was a way to just disable SMS texts.  Nothing but spam and malware - utterly useless.  Many of these exploits don't even require you to view the message in messages - just your phone receiving them was enough to launch the exploit.

    It boggles my mind that after all these years input validation and sanitization still sucks.  That programmers resist things like type-safe languages because it takes more up front work.  Well duh!  It's a shitshow out there.  It didn't matter as much when machines weren't networked; yet crap still spread via disk.  Now computers are networked but "computer science" has advanced relatively little in cleaning this crap up.  

    I'm far from a big government type of person, but I am a pragmatist. I think it's far beyond time to start treating computer science like professional engineering.  No more EULAs hand waving all responsibility for software quality away.  Individuals should be required to sign off on software builds - especially for critical systems like operating systems - just like engineers have to sign off on and are personally liable for plans.  

    Enough is enough.  
    edited April 2022
  • Reply 3 of 9
    lkrupp Posts: 10,557 member
    docno42 said:
    Ugh - wish there was a way to just disable SMS texts.  Nothing but spam and malware - utterly useless.  Many of these exploits don't even require you to view the message in messages - just your phone receiving them was enough to launch the exploit.

    It boggles my mind that after all these years input validation and sanitization still sucks.  That programmers resist things like type-safe languages because it takes more up front work.  Well duh!  It's a shitshow out there.  It didn't matter as much when machines weren't networked; yet crap still spread via disk.  Now computers are networked but "computer science" has advanced relatively little in cleaning this crap up.  

    I'm far from a big government type of person, but I am a pragmatist. I think it's far beyond time to start treating computer science like professional engineering.  No more EULAs hand waving all responsibility for software quality away.  Individuals should be required to sign off on software builds - especially for critical systems like operating systems - just like engineers have to sign off on and are personally liable for plans.  

    Enough is enough.  
    Yes, if software developers were legally liable for the flaws in their products, especially when it comes to security flaws, maybe they would practice better due diligence. How many times do we read about exploits made possible by buffer overflows and faulty input checking, the very basics of coding? Why does it take the software being let loose on the public for these bugs to be found?

    One of my sons is a structural engineer and, like you point out, if his name is on the blueprints he is responsible if something goes wrong due to his engineering designs or calculations. That’s why we have the PE (Professional Engineer) certifications. People can die if the engineering is faulty and the bridge or building collapses. The same should go for software engineering.

    Now we’re worried about the Russians waging cyberwar on our infrastructure. Why? Because the software running it has holes in it, that’s why. Last night’s 60 Minutes had a segment about how the Russians are constantly probing our infrastructure like power, water, food, petroleum, looking for ways into the systems and planting malware for future activation. 
    edited April 2022
  • Reply 4 of 9
    lkrupp said:
    One of my sons is a structural engineer and, like you point out, if his name is on the blueprints he is responsible if something goes wrong due to his engineering designs or calculations. That’s why we have the PE (Professional Engineer) certifications. People can die if the engineering is faulty and the bridge or building collapses. The same should go for software engineering.

    Now we’re worried about the Russians waging cyberwar on our infrastructure. Why? Because the software running it has holes in it, that’s why. Last night’s 60 Minutes had a segment about how the Russians are constantly probing our infrastructure like power, water, food, petroleum, looking for ways into the systems and planting malware for future activation. 
    I approached writing software like your son does structural engineering. Being a Control Systems Engineer by profession, making bulletproof software was a matter of pride for me. Yes, it took longer to code, test, and everything but it was worth it in the long run.
    I lost count of the run-ins I had with scrum masters who wanted apparently minor things like error handling relegated to technical debt that would never get attended to. Most of the time, I delivered very robust software that needed little attention while in operation. 
    The problem is that delivering code comes out of CapEx. Fixing it later comes out of OpEx.
    Quality costs. It can be done but few companies want it. They want something delivered NOW and for zero cost.
    There is a saying in the north of England...
     you don't get owt for nowt. 

    Very true
    edited April 2022
  • Reply 5 of 9
    mknelson Posts: 1,124 member
    lkrupp said:
    So at least we now know that ALL governments are hell bent on surveilling their citizens and will do anything to prevent effective security and privacy. 
    So, Spain was the last one on your "ALL governments" hyperbole bingo card?
  • Reply 6 of 9
    jimh2 Posts: 614 member
    lkrupp said:
    docno42 said:
    Ugh - wish there was a way to just disable SMS texts.  Nothing but spam and malware - utterly useless.  Many of these exploits don't even require you to view the message in messages - just your phone receiving them was enough to launch the exploit.

    It boggles my mind that after all these years input validation and sanitization still sucks.  That programmers resist things like type-safe languages because it takes more up front work.  Well duh!  It's a shitshow out there.  It didn't matter as much when machines weren't networked; yet crap still spread via disk.  Now computers are networked but "computer science" has advanced relatively little in cleaning this crap up.  

    I'm far from a big government type of person, but I am a pragmatist. I think it's far beyond time to start treating computer science like professional engineering.  No more EULAs hand waving all responsibility for software quality away.  Individuals should be required to sign off on software builds - especially for critical systems like operating systems - just like engineers have to sign off on and are personally liable for plans.  

    Enough is enough.  
    Yes, if software developers were legally liable for the flaws in their products, especially when it comes to security flaws, maybe they would practice better due diligence. How many times do we read about exploits made possible by buffer overflows and faulty input checking, the very basics of coding? Why does it take the software being let loose on the public for these bugs to be found?

    One of my sons is a structural engineer and, like you point out, if his name is on the blueprints he is responsible if something goes wrong due to his engineering designs or calculations. That’s why we have the PE (Professional Engineer) certifications. People can die if the engineering is faulty and the bridge or building collapses. The same should go for software engineering.

    Now we’re worried about the Russians waging cyberwar on our infrastructure. Why? Because the software running it has holes in it, that’s why. Last night’s 60 Minutes had a segment about how the Russians are constantly probing our infrastructure like power, water, food, petroleum, looking for ways into the systems and planting malware for future activation. 
    Beyond absurd. Building plans are so limited in scope that there are few if any unknowns, and there certainly is nothing that cannot be solved or the drawings reworked. Software is infinitely more complex, as the codebase is impossible for one person or even a team of people to understand, much less sign off on other than saying it is a best effort. Factor in the manufacturer's firmware (for each component), behind-the-scenes private APIs, public APIs, and third-party APIs, and you have no way to resolve every possible error. Even if you could 100% validate all of the components and software layers, who would be responsible for an error that occurs in communication between two different layers or components? The task is impossible, and we would be in a world with very little, super expensive software, or all software would be unsigned. Of course there would be a huge waiver saying anything that happens is your problem, and if you do not like it then do not buy.

    Your grasp of where computer science is versus where it was 10, 20, or 30 years ago is flawed. If you had developed software 30 years ago and could have looked forward to see what is available today, you would just quit and take another career path. Short of the C standard library and manufacturer APIs, there was very little selection of software libraries for even the most basic tasks. In most cases you had to write it from scratch if you needed it. Now, with a little searching, you can find libraries to do almost anything you can think of, most of which are free to use, with some exceptions depending on the license.

    Everyone is doing the best they can, as no one wants to publish flawed software. Think of something as simple as a door lock or padlock. Every lock you can find is pickable, no matter how sophisticated, and in most cases the picking is not difficult. A thousand-plus years of locks, and they are all pickable. Shall we require sign-off on something as simple and fundamental as a lock? There is not a single lock sold at any hardware store that cannot be picked by someone with a modest skillset. You can go to a locksmith, and the advantage there is their locks might be a little harder to pick, but they are still very pickable. The best require you to order a key through a locksmith so you cannot easily get an extra, but that is more of an inconvenience than a security feature.
  • Reply 7 of 9
    From my University days in the late 1980s/early 1990s:

    One of the "Real Programmers" sayings was "Strong typing is for people with weak memories."

    One of my classes had a policy: "If your code does not work for every test, it does not work." Sadly, only one class. But everyone who took it remains paranoid to this day. A lot of them became software testers. :)
  • Reply 8 of 9
    Spain is not a real democracy when it comes to the national minority of Catalonia, and of course Pegasus was one of the tools used to exert that control. Let’s not forget that Spain also banned websites that talked about the benefits of independence and incarcerated Catalan activists who made fun of Spanish institutions on Twitter, as well as other Catalan activists and leaders. 
  • Reply 9 of 9
    docno42 Posts: 3,755 member
    jimh2 said:
    Everyone is doing the best they can
    That is complete and utter bullshit.  It’s a shitshow of interaction because there is little incentive to formalize things. Computers are no longer used by hobbyists - critical infrastructure is running on commonplace operating systems!  “Because it’s hard” is a pretty weak cop-out.  Yeah, it’s hard - but it’s far from impossible and no, we ARE NOT doing the best we can.

    Hell, how many systems are exposed directly to the Internet purely for convenience?  Looking through scans from sites like Shodan is a horror show.  Stop making excuses for bad behavior - and stop calling it computer science.  There is nothing scientific about the way we handle computers professionally today. 

    Also, the ridiculous strawman of the week award goes to your inane lock analogy.  When someone can anonymously pick your lock from anywhere in the world as long as they have a connection to the Internet, you might have a point.  If anything you prove exactly the opposite of what you meant to and serve as the poster child for how we ended up here: people suck at assessing and mitigating real risk.
    edited April 2022