Apple opposes gov order to unlock iPhone
#1
http://nyti.ms/1Ltdr0U

The site won't let me paste it here, but they want a master key made by Apple to bypass any security, and they are using the San Bernardino shooting as the reason for this key.
#2
(02-17-2016, 11:19 AM)StLucieBengal Wrote: http://nyti.ms/1Ltdr0U

The site won't let me paste it here, but they want a master key made by Apple to bypass any security, and they are using the San Bernardino shooting as the reason for this key.


Here is the story:

Quote:SAN FRANCISCO — Apple said on Wednesday that it would oppose and challenge a federal court order to help the F.B.I. unlock an iPhone used by one of the two attackers who killed 14 people in San Bernardino, Calif., in December.


On Tuesday, in a significant victory for the government, Magistrate Judge Sheri Pym of the Federal District Court for the District of Central California ordered Apple to bypass security functions on an iPhone 5c used by Syed Rizwan Farook, who was killed by the police along with his wife, Tashfeen Malik, after they attacked Mr. Farook’s co-workers at a holiday gathering.

Judge Pym ordered Apple to build special software that would essentially act as a skeleton key capable of unlocking the phone.


But hours later, in a statement by its chief executive, Timothy D. Cook, Apple announced its refusal to comply. The move sets up a legal showdown between the company, which says it is eager to protect the privacy of its customers, and the law enforcement authorities, who say that new encryption technologies hamper their ability to prevent and solve crime.
In his statement, Mr. Cook called the court order an “unprecedented step” by the federal government. “We oppose this order, which has implications far beyond the legal case at hand,” he wrote.

Asked about Apple’s resistance, the Justice Department pointed to a statement by Eileen M. Decker, the United States attorney for the Central District of California (http://www.justice.gov/usao-cdca/meet-us-attorney): “We have made a solemn commitment to the victims and their families that we will leave no stone unturned as we gather as much information and evidence as possible. These victims and families deserve nothing less.”

The F.B.I. said that its experts had been unable to access data on Mr. Farook’s iPhone, and that only Apple could bypass its security features. F.B.I. experts have said they risk losing the data permanently after 10 failed attempts to enter the password because of the phone’s security features.


The Justice Department had secured a search warrant for the phone, owned by Mr. Farook’s former employer, the San Bernardino County Department of Public Health, which consented to the search.


Because Apple declined to voluntarily provide, in essence, the “keys” to its encryption technology, federal prosecutors said they saw little choice but to get a judge to compel Apple’s assistance.


Mr. Cook said the order would amount to creating a “back door” to bypass Apple’s strong encryption standards — “something we simply do not have, and something we consider too dangerous to create.”


In 2014, Apple and Google — whose operating systems are used in 96 percent of smartphones worldwide — announced that they had re-engineered their software with “full disk” encryption, and could no longer unlock their own products as a result.


That set up a confrontation with police and prosecutors, who want the companies to build, in essence, a master key that can be used to get around the encryption. The technology companies say that creating such a key would have disastrous consequences for privacy.


“The F.B.I. may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a back door,” Mr. Cook wrote. “And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.”

An Apple spokeswoman declined to elaborate on the statement, but the company’s most likely next step is to file an appeal.


The legal issues are complicated. They involve statutory interpretation, rather than constitutional rights, and they could end up before the Supreme Court.


As Apple noted, the F.B.I., instead of asking Congress to pass legislation resolving the encryption fight, has proposed what appears to be a novel reading of the All Writs Act of 1789.


The law lets judges “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”


The government says the law gives broad latitude to judges to require “third parties” to execute court orders. It has cited, among other cases, a 1977 ruling requiring phone companies to help set up a pen register, a device that records all numbers called from a particular phone line.



Apple, in turn, argues that the scope of the act has strict limits. In 2005, a federal magistrate judge rejected the argument that the law could be used to compel a telecommunications provider to allow real-time tracking of a cellphone without a search warrant.

Marc J. Zwillinger, a lawyer for Apple, wrote in a letter for a related case in October that the All Writs Act could not be interpreted to “force a company to take possession of a device outside of its possession or control and perform services on that device, particularly where the company does not perform such services as part of its business and there may be alternative means of obtaining the requested information available to the government.”


The government says it does not have those alternative means.


Mr. Cook’s statement called the government’s demands “chilling.”


He added: “If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data.

The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.”


The Electronic Frontier Foundation, a nonprofit organization that defends digital rights, said it was siding with Apple.


“The government is asking Apple to create a master key so that it can open a single phone,” it said Tuesday evening. “And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”


The San Bernardino case is the most prominent such case, but it is not the first.

Last October, James Orenstein, a federal magistrate judge in Brooklyn, expressed doubts about whether he could require Apple to disable its latest iPhone security features, citing the failure of Congress to resolve the issue despite the urging of the Justice Department.


The judge said such requests should fall under a different law, the Communications Assistance for Law Enforcement Act of 1994, which covers telecommunications and broadband companies.


Congress has been debating whether to amend that act to include technology companies like Apple, Facebook and Google, and Judge Orenstein said he would consider ordering Apple to unlock the phone when and if Congress makes the change. That case is still pending.


Although Apple is portraying its opposition to Judge Pym’s order as a principled defense of privacy, one of its motivations is the preservation of its reputation for robust encryption, at a time of rising concerns about identity theft, cybercrime and electronic surveillance by intelligence agencies and overzealous law enforcement agencies.


Apple also says that a master key would amount to a vulnerability that hackers could exploit.


China is watching the dispute closely. Analysts say that the Chinese government does take cues from the United States when it comes to encryption regulations, and that it would most likely demand that multinational companies provide accommodations similar to those in the United States.


Last year, Beijing backed off several proposals that would have mandated that foreign firms provide encryption keys for devices sold in China after heavy pressure from foreign trade groups. Nonetheless, a Chinese antiterrorism law passed in December required foreign firms to hand over technical information and to aid with decryption when the police demand it in terrorism-related cases.


While it is still not clear how the law might be carried out, it is possible a push from American law enforcement agencies to unlock iPhones would embolden Beijing to demand the same. China would also most likely push to acquire any technology that would allow it to unlock iPhones. Just after Apple introduced tougher encryption standards in 2014, Apple users in China were targeted by an attack that sought to obtain login information from iCloud users.
[Image: giphy.gif]
Your anger and ego will always reveal your true self.
#3
I'm with both sides.

Apple did right by initially telling them to get a warrant. But after the warrant, I can't see how Apple can avoid providing access to the information. The whole forcing them to make a "key" is another argument (I don't think they can), but if Apple can access the info, I think the warrant compels them to do so.

You have a right to privacy, but if there's just cause (in this case, an act of terrorism), then you're going to be searched. That extends to digital devices.
[Image: 4CV0TeR.png]
#4
Here is the letter to Apple customers.

http://www.apple.com/customer-letter/


Quote:February 16, 2016

A Message to Our Customers


The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand. 

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.



The Need for Encryption


Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.


All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.


Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.


For many years, we have used encryption to protect our customers’ personal data because we believe it’s the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.


The San Bernardino Case


We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government’s efforts to solve this horrible crime. We have no sympathy for terrorists.


When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.


We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.


Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.


The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.


The Threat to Data Security


Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.


In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.


The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.


The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.


We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.


A Dangerous Precedent


Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.


The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by “brute force,” trying thousands or millions of combinations with the speed of a modern computer.


The implications of the government’s demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone’s device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge.


Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.


We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.


While we believe the FBI’s intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.


Tim Cook
#5
I think things like this need to be handled on a case-by-case basis. The FBI asking for assistance in breaking into the iPhone in this situation is absolutely a reasonable request. But creating an OS that could simply be installed on any iPhone to make this happen, and having it in the hands of anyone, let alone a federal agency, is concerning.

If such an OS were to exist it would not stay secure. Someone would obtain it and use it against people. Murphy's Law.
#6
Thanks for posting that Dino.
#7
(02-17-2016, 11:50 AM)Benton Wrote: I'm with both sides.

Apple did right by initially telling them to get a warrant. But after the warrant, I can't see how Apple can avoid providing access to the information. The whole forcing them to make a "key" is another argument (I don't think they can), but if Apple can access the info, I think the warrant compels them to do so.

You have a right to privacy, but if there's just cause (in this case, an act of terrorism), then you're going to be searched. That extends to digital devices.


The warrant granted them access to the phone; their inability to get access is on them. They are trying to over-interpret an old law to force Apple to develop technology that doesn't exist. That isn't a reasonable expectation, and at some point it will be overturned. Encryption isn't illegal, and until a law exists that requires a back door, they are within their rights to refuse to develop one.
#8
(02-17-2016, 01:59 PM)Au165 Wrote: The warrant granted them access to the phone; their inability to get access is on them. They are trying to over-interpret an old law to force Apple to develop technology that doesn't exist. That isn't a reasonable expectation, and at some point it will be overturned. Encryption isn't illegal, and until a law exists that requires a back door, they are within their rights to refuse to develop one.

I agree with the bold and that's what I disagree with.

On the other hand, I think Apple is full of crap on that. When Steve Jobs resigned (and died a few months later), I would guess the first thing they did was take his work phone, work laptop and all other devices and then dug through them so they can stay afloat for the next decade until somebody else comes up  with a decent idea. If they have that technology (which I think they do) then it becomes debatable as to whether they should use it at the request of law enforcement.

It's like OnStar declining to track a vehicle that's stolen or that has a kidnapping victim. I think they should... until a warrant comes in. At that point, if they have the technology to assist and it isn't overly disruptive to company production, I think they should.
#9
(02-17-2016, 01:59 PM)Au165 Wrote: The warrant granted them access to the phone; their inability to get access is on them. They are trying to over-interpret an old law to force Apple to develop technology that doesn't exist. That isn't a reasonable expectation, and at some point it will be overturned. Encryption isn't illegal, and until a law exists that requires a back door, they are within their rights to refuse to develop one.

This.

This is a very slippery area where technology is ahead of the law.  I see the law being changed to make all encryption providers responsible for providing a key to the encryption.  People will still be able to buy encryption programs but they will know that if there is probable cause to believe they are involved in criminal activity the government will be able to access it.  

This will lead to a black market for encryption.  It will be treated like a type of weapon.  Criminals (and some people who don't trust the government) will buy illegal programs.    
#10
(02-17-2016, 02:08 PM)Benton Wrote: If they have that technology (which I think they do) then it becomes debatable as to whether they should use it at the request of law enforcement.

It's like OnStar declining to track a vehicle that's stolen or that has a kidnapping victim. I think they should... until a warrant comes in. At that point, if they have the technology to assist and it isn't overly disruptive to company production, I think they should.


I, and probably most people, agree with you, but the law will have to be changed.
#11
(02-17-2016, 01:55 PM)StLucieBengal Wrote: Thanks for posting that Dino.

ThumbsUp
#12
(02-17-2016, 02:08 PM)Benton Wrote: I agree with the bold and that's what I disagree with.

On the other hand, I think Apple is full of crap on that. When Steve Jobs resigned (and died a few months later), I would guess the first thing they did was take his work phone, work laptop and all other devices and then dug through them so they can stay afloat for the next decade until somebody else comes up  with a decent idea. If they have that technology (which I think they do) then it becomes debatable as to whether they should use it at the request of law enforcement.

It's like OnStar declining to track a vehicle that's stolen or that has a kidnapping victim. I think they should... until a warrant comes in. At that point, if they have the technology to assist and it isn't overly disruptive to company production, I think they should.

Jobs died in 2011, their new encryption didn't come out until 2014. It's real and they can't break it.....that's kind of the point.

Not quite like OnStar; a person in a car has no reasonable expectation of privacy, so that makes it a lot easier. All customers who aren't doing illegal activity have an expectation of complete encryption from Apple (which was told to them when they purchased the product), and all of them could in fact sue Apple for providing such a key that could be used by the government post-sale.
#13
(02-17-2016, 02:15 PM)Au165 Wrote:  all of them could in fact sue Apple for providing such a key that could be used by the government post sale.

No, they could not sue Apple for doing something that the government ordered them to do.

At the time Apple made the promise it was correct under the law.  They would not be held responsible for the law changing.
#14
BTW, all they want Apple to do is disable the portion of the software that locks the phone up after 10 failed attempts to enter the proper password. The government would then use some powerful program to try every possible combination for the password.
#15
(02-17-2016, 02:15 PM)Au165 Wrote: Jobs died in 2011, their new encryption didn't come out until 2014. It's real and they can't break it.....that's kind of the point.

Jobs was an example. Like I said, I find it hard to believe the company that created it has no way to access their own stuff.

Quote:Not quite like OnStar, a person in a car has no reasonable expectation of privacy so it make it a lot easier. All customers who aren't doing illegal activity have an expectation of complete encryption from Apple (which was told to them when they purchased the product) and all of them could in fact sue Apple for providing such a key that could be used by the government post sale.

To the bold, it doesn't matter. In 2013 the SCOTUS already ruled that for serious offenses they can swab your mouth for DNA. They can take your fingerprints. They can't store it until after trial, but they can take it whether you object or not. Provided they have a warrant. Same as anything you have written in a notebook, emails, any tangible items you may have used to commit a crime.

As far as the expectation of encryption, meh. Data theft happens every day from iPhones and Apple hasn't stopped that. A terrorist attack occurs, information on the phone could prevent other attacks and suddenly Apple becomes concerned with protecting data? And no, I'm not suggesting they support terrorism. I'm just saying their logic is flawed. Their phones are already exploitable, so saying they can't do it to a better degree than somebody working out of their mom's basement is silly.
#16
(02-17-2016, 02:26 PM)fredtoast Wrote: No, they could not sue Apple for doing something that the government ordered them to do.

At the time Apple made the promise it was correct under the law.  They would not be held responsible for the law changing.

Good point. And there's probably something in the EULA to that effect.

Not that anyone reads those.
#17
If it were an easy thing to do, Apple should comply. From the sound of it, what the government is requesting doesn't exist, and Apple is worried that if it did, it could leak and be destructive. There certainly needs to be some mechanism in the future that allows access where possible, but authorities need to accept the reality that maybe they won't be able to do anything about it.
#18
(02-17-2016, 02:37 PM)Benton Wrote: Jobs was an example. Like I said, I find it hard to believe the company that created it has no way to access their own stuff.


To the bold, it doesn't matter. In 2013 the SCOTUS already ruled that for serious offenses they can swab your mouth for DNA. They can take your fingerprints. They can't store it until after trial, but they can take it whether you object or not. Provided they have a warrant. Same as anything you have written in a notebook, emails, any tangible items you may have used to commit a crime.

As far as the expectation of encryption, meh. Data theft happens every day from iPhones and Apple hasn't stopped that. A terrorist attack occurs, information on the phone could prevent other attacks and suddenly Apple becomes concerned with protecting data? And no, I'm not suggesting they support terrorism. I'm just saying their logic is flawed. Their phones are already exploitable, so saying they can't do it to a better degree than somebody working out of their mom's basement is silly.

Once again, you don't understand how full-disk encryption works. It is like taking a safe that is unpickable and letting someone else set the combination. The manufacturer of the safe can't get into it; that is the point of building these super high-end security features.
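(As an aside, the safe analogy maps closely to how passcode-based full-disk encryption generally works: the decryption key is derived from the user's passcode plus a per-device secret, so the manufacturer never holds a copy. A toy sketch in Python; the function name, salt, and iteration count here are illustrative stand-ins, not Apple's actual scheme.)

```python
import hashlib

def derive_key(passcode: str, device_salt: bytes) -> bytes:
    """Stretch a passcode into a 32-byte encryption key.

    The key only exists when the passcode is entered; nothing stored
    by the manufacturer can reproduce it without the passcode.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_salt, 100_000)

salt = b"per-device-hardware-secret"   # bound to the device, never leaves it
key = derive_key("1234", salt)

# A different passcode yields an unrelated key, so there is no
# "master" key anyone could hand over.
assert derive_key("9999", salt) != key
```

That's why "just unlock it" isn't an option for the vendor: the only path back to the key is guessing the passcode itself.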

The second point has no impact on what I am saying, OnStar tracking you is not the same as forcing a company to add a backdoor into a secure product. It is a completely irrelevant comparison.

This last point is false, and actually shows a complete ignorance on the subject.
#19
(02-17-2016, 02:28 PM)fredtoast Wrote: BTW, all they want Apple to do is disable the portion of the software that locks the phone up after 10 failed attempts to enter the proper password. The government would then use some powerful program to try every possible combination for the password.

It's a brute-force attack. The issue is that you can't rewrite the software that enforces the 10-try self-destruct without first having access to the system... which you don't have until you get into it. This is where the whole idea becomes crazy.
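To see why that wipe limit matters so much: with it disabled, a short numeric passcode falls to an exhaustive search almost immediately. A hedged sketch of what such a brute-force search amounts to (the key-derivation function and parameters are stand-ins, not iOS internals):

```python
import hashlib
from itertools import product

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # Stand-in for the device's passcode-to-key derivation.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

def brute_force(target_key: bytes, salt: bytes, iterations: int = 100_000):
    # With the 10-try wipe disabled, all 10,000 four-digit passcodes
    # can be tried; only the per-guess derivation cost slows this down.
    for digits in product("0123456789", repeat=4):
        guess = "".join(digits)
        if derive_key(guess, salt, iterations) == target_key:
            return guess
    return None

salt = b"per-device-secret"
recovered = brute_force(derive_key("0042", salt), salt)  # recovers "0042"
```

The catch Au165 points out still stands: shipping this search is trivial; getting the retry limit turned off in the first place requires modifying software on a device you can't yet access.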
#20
(02-17-2016, 02:26 PM)fredtoast Wrote: No, they could not sue Apple for doing something that the government ordered them to do.

At the time Apple made the promise it was correct under the law.  They would not be held responsible for the law changing.

That would be true if the law changed. As it stands, there is nothing illegal about encryption, and customers were promised full encryption. To surrender the key without mounting a very strong legal opposition would be failing to deliver on the marketed feature. One of the reasons I don't think the order will stand is this very argument: encryption isn't illegal, and providing a back door would be counterintuitive to the thing they are selling. Make it illegal or deal with it.




