Hacker News
Facebook Planned to Spy on Android Phone Users, Internal Emails Reveal (twitter.com/ashk4n)
200 points by DyslexicAtheist on Feb 23, 2019 | 68 comments



How is Facebook not producing malware?

I struggle to see how they’re legally different from malvertising: perhaps there are some vague terms about data or advertising (which also exist for malvertising! you agreed to the ads!), but at no point was there a “meeting of the minds” in which Facebook was authorized to bypass security mechanisms to exfiltrate data from the system.

I don’t see how Facebook’s behavior is anything but classic “hacking”, and a willful abuse of the CFAA.

I think it’s time we started addressing gigacriminals[0] like Zuckerberg: we need to stop pandering to people who commit millions of crimes a day, at “web scale”, and just use force.

If Facebook doesn’t want to get the memo that wanton criminality isn’t okay, then it’s time to use force to correct the behavior.

[0] I use gigacriminal in the technical sense: I believe Mark Zuckerberg, as the leader of an organization, has ordered over 1 billion criminal acts to be committed for his profit.


FB is making a lot of people a lot of money. Good luck getting traction on getting Zuck to stop.

Also, while I think Zuck is devious (based on the history of FB), I think Sandberg is the brains of the worst of FB.


Send him to prison


Lock 'em up!


I know we have lots of FB employees at HN.

I would be genuinely curious to hear, via throwaway accounts if need be, about how FB staff rationalise things like this happening.

Do you shrug it off as not a big deal in the long run? As FB still doing a net amount of good versus what you perceive as isolated incidents like this? I'm just in good faith trying to figure out how people willingly work and continue to work for outfits that repeatedly engage in behaviour such as this. I know there are lots of speculative reasons we can put forward, but I think we have a great opportunity here in our community to have first-hand input.


I'm in a similar boat.

I worked with a group at Facebook, and I almost refused to take the project on. I'm the type that has deleted my Facebook account and uses a blocker to stop their tracking, and when I showed up to work with the team, they were surprised that I didn't have an account.

From what I could tell, the teams are fairly isolated and thus don't see the forest for the trees. When someone points to an article like this, it seems that they just shrug it off and think the author probably got it wrong because it doesn't seem that way from the inside (again, they only work in isolated teams, but still think they have enough of an insider's perspective to discount it). Even huge companies we know now as bad had a ton of employees, like Enron.

I would really like the perspective of someone "on the inside". Facebook is one of the companies I trust the least with my data, yet they have so much talent that I can't help but wonder how they convinced them to work there (is it just the money?).


Not an employee, and I like my data to be safe and not exploited as much as anyone. But let me try to take a swing at this.

First, obviously there is the money factor; you may choose to ignore it, but it is a big factor for many.

Second, the tech is truly state of the art, and it is good experience/skill to have/pick up.

Third, I'd say almost everyone passing judgement on people who take such jobs also judges having such jobs on a resume as a positive. That drives one's value as a candidate up, even after such a job. If people care so much, why don't they provide an incentive: would you hire someone who turned down such a job over someone who gained experience in it? No one has ever asked me during an interview, "So which job offers did you turn down, and why?"

Fourth, almost every big company has its scandals. How does one decide which ideal is worth giving up a job offer for? Is big finance OK? Is big pharma OK? Is big tech OK? Is big anything OK? Is working on open source at such a big something OK, say open source software others use at their jobs? Is a startup using questionable practices to get to the next level OK? Define what's OK according to you, and explain why you expect that to be the same for everyone else.


Which technology at Facebook is state of the art?

I'd wager that for any area you pick (they probably have a lot of high-end technology relating to image/video storage, data replication, machine learning, and network-layer infrastructure), there are other more morally and ethically sound places where engineers could learn and apply that same knowledge.

We're already reaching the point where working for toxic companies is considered a negative during resume review; I won't provide any such examples here but the bay area tech scene is full of examples of environments where being a former employee at a company can at least warrant raised eyebrows.

Scandals may occur; what matters is how the organization responds to them. And yes it's certainly acceptable to leave an organization if you're not happy with the way it has handled such situations.

The only point I find difficult to disagree with in your comment is the monetary motivation.


>Which technology at Facebook is state of the art?

Say data at scale, petabytes of data for example. I'd be curious to know if you can name all companies that have this scale of data and are morally acceptable to you. :) Google? Amazon?

> Scandals may occur; what matters is how the organization responds to them. And yes it's certainly acceptable to leave an organization if you're not happy with the way it has handled such situations.

While I can see your point of view (as an engineer you can find other opportunities that may not be as lucrative but are still comparably good), I also find it odd that it's the engineers who get this judgement regularly on HN while users and shareholders get a free pass. A scandal surfaces, repeatedly; users and shareholders don't care; nothing changes; and for some reason that's OK while engineers are expected to be the moral compass. I wonder how many people judging here use Instagram/WhatsApp/FB and/or own stock in such companies, perhaps even have family and friends who continue to use these services. But I guess it's easier to judge strangers and expect them to behave a certain way instead.


For me personally, none of the 'really' major tech companies are; I don't need to work on the very cutting edge desperately enough to trade that off against morality. But I'm not innocent either; most actions have (ideally unintended, and later rectified) negative externalities.

It'd be an interesting discussion to have with someone who feels like they really need to stay at the very peak of private data accumulation - because in my view those actions are potentially very detrimental to wider society, certainly depending on the culture. I'd extend more respect to Google than the others from what I've seen, although opinions may vary elsewhere.

Regarding scandals and reactions - users and shareholders can and do care, and they vote with their feet, or wallets, or ideally both.

The battlefield in these cases is over how much truth about the scandal and resolution are published. A good organization will generally tend towards more transparency in both, while perhaps keeping a few cards close so that they can react to any potential retaliation (such is the world of rapid fake news that we live in).

Edit: s/data accumulation/private data accumulation/


> would you hire someone who turned down such a job compared to someone who gained experience in such a job?

Yes, I would. At this point, willingly taking a job at Facebook is a bit of a red flag about the potential employee's ethics.


From personal conversations with tech folk who don’t work at Facebook, I’d say it’s about the money. Most people say they don’t want to work there, but if the money was fuck-you good, they wouldn’t say no.


(throwaway) I started at FB out of college relatively recently and wasn't there for very long, so I can't speak for longer-term/more senior employees but hopefully this is a bit helpful. I myself joined because I needed a job, the money was amazing, and I wanted to be in the Bay Area. When I went in, there were already questions about data privacy/elections, but this was still before Cambridge Analytica and the subsequent weekly bad news that's gone on for a year now. Personally I've never been a big FB user and wasn't that enthusiastic going in, but a) the money and b) the scale of their data/infrastructure/data infrastructure was attractive.

My impression was that many employees hold a self-contradictory view about the extent of their influence at the company. When asked about their jobs, they tell you that they're working hard on fixing the problem and making impact ("where better to fix it than from inside?"). But when confronted w/stories like Onavo, they get defensive because "it's a big company, I had no way of knowing." Which is fair, honestly; the problem is that they think they can fix anything in the first place. Part of the problem is that FB advertises itself internally as being super transparent but it isn't at all. (This applies mostly to product/data+ML people. The infra folks I worked with for the most part just want to make their money and go home.)

A lot of longtime employees joined when FB was good and amazing in the media, and it's hard for them to accept that it's really gone in a bad direction. A lot of younger ones join for the money, and/or because they're coming from FB's massive, culty college intern pipelines (especially if you come out of FBU) and confuse being dazzled by the perks with actually believing in the mission. The money is big for everyone (I was there for part of the long 2018 stretch where the stock price just fell and fell, and you could feel people getting antsy), and the defensiveness that comes from constantly seeing negative news is another part. Lots of blame thrown around internally (leakers, leadership, bad eng practices) but little responsibility; lots of sunk cost fallacy-ish thinking ("we all took jobs here for a reason, we can't just give up and leave").


Thank you for taking the time to respond; this is insightful and informative.


Thank you


It’s really hard as a Facebook employee to engage with those topics. The probability of your comment hitting the front page of the New York Times is very high. There is also a lot of pending litigation, so you’re likely to be involved in it, which is the last thing you want as an engineer.


When have HN comments been featured in an NYT article? I'd say they'd be very hesitant to do that because there's no way to verify anyone is who they say they are, which is usually a requirement.


Save a trip to Twitter. Here's the link to the story:

https://www.computerweekly.com/news/252458208/Facebook-plann...


The twitter thread has a few of the source documents, and a few things not mentioned in the story, this particular e-mail seems... interesting: https://twitter.com/ashk4n/status/1099146500725063680

Also, that twitter user is a former senior leader at the FTC, and claims something here to be "textbook @ftc deception": https://twitter.com/ashk4n/status/1099164648379580416


Serious question -- how does Facebook keep getting hit with these 5+ year-old emails getting published? Did they not have any email retention policy?

Is the root cause that they migrated off of email and did all their sensitive discussions in "internal tools" with no actual data retention enforcement? If so, that seems quite ironic.


Many people keep copies of emails. Perhaps they feel compelled to make the information public, and maybe they felt more compelled once they received their RSUs and cashed them in.

"Most full-time employees receive RSUs (restricted stock units) which are shares that become sellable on a set schedule over four years. " https://www.quora.com/Do-Facebook-employees-get-stock-option...


Most of the recent documents have become public because of a court case.


Also, here's a link to the full set of documents:

https://github.com/BuxtonTheRed/btrmisc/blob/master/fb-643-e...


Where does FB recruit engineering talent?

Are we (engineers) really that indifferent to what the organizations we support do?


Their recruiters reach out to me about 4 times a year. I will say that some of the challenges and scale of the problems Facebook is solving really can't be found at other companies. You're doing data analytics, product design, infrastructure engineering, dev ops, security and software engineering at a scale that's pretty much never been done before. Also, companies like Facebook, Twitter, and Google generally define frameworks that other companies use. Think about products like Angular (Google), Bootstrap (Twitter), ReactNative/ReactJS (Facebook) — If you want to work on a project of that scale, there aren't that many options for companies to work for. I always tell my buddy who works at Turner — Your company is never going to be better than Facebook when all of the tools you use are built by Facebook. I also hear that compensation and benefits are really good, plus their stock value is going up faster than other companies in the market. If you live in the Bay Area, you need that equity boost to move up the socioeconomic ladder.

I probably won't ever work at Facebook (I have all of their services blocked in my hosts file, I've deleted my account, and they aren't working on anything interesting to me), but I'm just trying to play devil's advocate and paint a picture of why someone would choose to work there.


Interesting bit: I wanted to counter the stock price argument, but after checking the current trend it's back to its former values from last year. Well thought out arguments, cheers!


I agree with everything except the projects. It is true that Google, Amazon, and Facebook have projects that are exclusive to them. But there are lots of other interesting and widely used projects out there: Redis, PostgreSQL, MariaDB, Rust, Firefox, Python, VueJS, Linux, Apache Hive, Hadoop, etc. If you care about projects that involve the largest number of servers, you need to work for one of those companies. But if you care about your code being used by a huge number of people, there are lots of other interesting things out there.


Also, you may not end up on one of those "cool" projects even if you land a FANG job.


All of those things apply to Amazon too, yet for some reason that’s not considered in these conversations...I wonder why


The difference is that Amazon doesn’t compensate their employees well.


They pay on par with the other big tech companies. See https://www.levels.fyi/SE/Amazon/Google/Facebook


Compared to what? They're almost certainly >= 95th percentile for software engineer compensation.


That’s...debatable.

Disclaimer: I work at Amazon.


This is just an anecdote, but as a PhD student in CS at a well-known university I often hear from undergrads (via teaching) what they think about companies and where they go for internships.

Observationally, ethics does not even enter the equation. Especially career-driven students from backgrounds where name-brand prestige is important just want such a name-brand on their CV.

Not that faculty are any better given how many academics effectively sign away their lab to FB through 'collaborations'. All they see is resources to use, name-brand recognition, a big personal pay-check, and publicity for their work.

What I find most mystifying is that this is a literal non-topic. At best someone may forward a blogpost about a security leak and make a snarky comment, but I have never witnessed any political discussion.


I have had a quick run at an analysis: academically trained vs. self-trained computer engineers among my peers.

From the small sample, there is a very strong correlation between working at ethically questionable organisations and having finished academia.

It's the opposite for most of my self-trained peers.


I'd argue that top bay area tech companies are recruiting CS degree holders at higher rates than bootcamp self taught types. It may be harder for self taught types to get past the HR recruiting filter so they settle for second tier companies that are more ethical.

Also, someone grinding through a rigorous CS degree might want the biggest ROI, versus a self-taught type who is happy to take what they can get and still get paid well without the debt of a CS degree.

I don't think we can draw any conclusions, and to state the obvious, correlation != causation.


Is there any chance this is the result of hiring filters on the part of the "ethically questionable" organizations?


Anecdotal evidence, but that has been my experience as well.


And mine as well, and anecdotal for sure. But, if there's something to it, I'm not sure how to account for it. Though one thing comes to mind.

Some time ago, a study was performed in my home country to look into possible correlations between academic performance and work performance among doctors. As it turned out, the best performing doctors were not those with top grades, at least not before entering medical school.

Those who did best had left high school with adequate grades, but not good enough to get into medical school. Rather, they had spent time and effort with supplementary studies to get their grades up to the level where they could apply for medical school.

The possible explanation that was presented was that some of those who had great grades straight out of school simply chose to become doctors because of the promise of prestige and remuneration. Those who didn't, but still fought their way into medical school, however, had a calling beyond money and status.


Same here.


Cum hoc ergo propter hoc / post hoc ergo propter hoc.


What does "finishing Academia" mean?


I'm gonna assume graduating with at least a BS, probably in Computer Science.


If that's the case, the apparent correlation probably has more to do with hiring practices than ethical differences between the groups.


Do you ever have (and use) opportunities to bring up the ethical aspects? It’s becoming a more prominent thing for people in the industry, and it makes sense to discuss before people leave academia. I know it’s not your problem to solve, but I think if the university isn’t talking about the ethical considerations it’s tacitly endorsing the ‘technology is value-neutral’ philosophy that’s begun to get some pushback.


I have with friends, but with undergraduates this is a bit more difficult.

An undergrad tells me they are going to FB for an internship. I am not their tutor or mentor beyond teaching. They have gone through stress and trouble for their internship, and it's too late to get something else for the year.

The only thing I can do in this instance (where there is no chance at all they would give up their internship to do nothing) is make them feel bad about it. Uncomfortable situation.

Honestly, I think the department should step up, disinvite FB from campus events, and not allow them to advertise. It should also hold courses on the ethical impact of technology and discuss a few cases of misuse.


At my German university that was a topic that regularly came up around internships, job fairs, and among those currently graduating and looking for jobs, with a wide range of opinions.


I feel like it's difficult to bring it up. What are you going to do... tell your friends/students their next job is at an unethical company?

Also, people generally believe it's quite possible to work on good things in a company that also does bad things. (And many of them indeed will.) So it's not the most compelling argument that you shouldn't work with X/Y/Z because they did bad thing W.


> What are you going to do... tell your friends/students

Mentors, professors, and friends can have a lot of influence, which (imo) could be a moral and ethical obligation one may want to exercise. If you were receiving advice or learning as a junior and looking for direction from a mentor (or any person you feel you can learn from and also trust), wouldn't you appreciate hearing their personal opinion on a subject, and how they arrived at their beliefs?

> Also, people generally believe it's quite possible to work on good things in a company that also does bad things. (And many of them indeed will.) So it's not the most compelling argument that you shouldn't work with X/Y/Z because they did bad thing W.

It's difficult if not impossible to convince somebody who just passed their first interview with a company to refuse the offer because of ethics. Especially if they have never had a job before, they might say: I'll do it anyway and see for myself; I can still bail if it's that bad.

But employees already working there have more power, by changing things from within. I think this is why the point above is valid: everyone has mentors, so speaking up (without judgement) is key. Only by changing the inside is it possible to have a dialogue about impact on environment/society, and only by talking about it will we eventually be able to abolish the practice of labeling any such discussion as anti-profit or social-justice seeking. It does affect the long-term image and how the company/brand will be perceived in the long run.


YES!

This is particularly important for young professionals! It's super important for fresh graduates, who enrolled in CS/CompEng because of how enthusiastic they are about technology, to hear this stuff!

Why? Because, while we're wondering whether or not ethics is even something that we should bring up in a discussion like this, Facebook has PR and recruiting departments full of smart people who are actively working on getting these folks on board.

And when you've spent the last four years of your life studying a highly-competitive field, in which there's barely any room for the study of philosophy, ethics and humanities, it's pretty hard to figure out this stuff on your own.

So damn right tell friends/students their next job is at an unethical company. I mentor interns every year, and whenever one of them asks me about companies like Facebook or Google, I absolutely tell them that I would never work there. I tell their recruiters the same thing. I tell my friends from outside the tech world the same thing.

> Also, people generally believe it's quite possible to work on good things in a company that also does bad things.

The problem isn't that you can't do good things; the problem is that these companies use those good things to whatabout the media away from the bad things they do. You think Facebook built that "Mark yourself as safe" feature out of the goodness of their heart? As if it brings them any kind of money? No: they do it to capture a little bit more of users' attention, and to point out to any reporter who questions their morals that they're totally on the Light Side of the Force: just look at how many users rely on us to let their friends know they're fine.


The Socratic method might be a viable option - ask questions. Ask if they have concerns. Ask where their hard-stop lines are. Be friendly about it - attacks don’t help things but conversations can, even if it doesn’t look like it in the moment. Bringing the topic up and making it something people talk about will actually start to help.


> What are you going to do... tell your friends/students their next job is at an unethical company?

Yes.


I refused a job interview with them for ethical reasons, but I'm not blaming people who accept. Morality is an arbitrary line. My phone is probably made by slaves, my clothes by children, and my bank probably invests in arms companies.

It's good if one can take the opportunity to take a stand. But let's not pretend we are paragons because we do.


Given the widespread apologism and hand-waving around Google, Facebook, and others here, this is hardly surprising. Decision makers shoulder the blame, but what we are seeing from these revelations is that rank-and-file tech folks are 'eager' and willing enablers.

Ethics is reduced to meaningless posturing if, whenever it comes to action, people have a litany of excuses on hand. In the case of surveillance, whether it's government or private, it's individuals empowering themselves at the cost of others, not particularly burdened by the wider social ramifications.

It's a bit tragic that it's only when they are without power that people talk of ethics, and even with little power it seems self-interest always rules. And discussions become pointless as it becomes impossible to tell how many would act differently.


It depends. If you have a variety of employment options, it is easy to follow your “conscience”. When you have fewer options and more personal responsibilities (private school tuition, piano lessons, etc.), you start to lose the luxury of working for less pay.


Hire young. Their ethics haven't evolved yet. [0]

[0] Not a condemnation of the young, but they still don't realize what they don't realize at that point.


They offer the top salaries, and they have 2B+ users.


I wonder if they psychologically test for and target these traits during recruitment.


After their internal memos about valuing "connecting people" over anything else (like safety) leaked[1], some of the comments by Facebook employees were almost cult-like[2]:

> Leakers, please resign instead of sabotaging the company

> How fucking terrible that some irresponsible jerk decided he or she had some god complex that jeopardizes our inner culture and something that makes Facebook great?

In particular, this included comments about wanting a loyalty test "screen" in their hiring process:

> Although we all subconsciously look for signal on integrity in interviews, should we consider whether this needs to be formalized in the interview process?

> This is so disappointing, wonder if there is a way to hire for integrity. We are probably focusing on the intelligence part and getting smart people here who lack a moral compass and loyalty.

[1] https://www.cbsnews.com/news/facebook-memo-the-ugly-andrew-b...

[2] https://www.theverge.com/2018/3/30/17179100/facebook-memo-le...


That last one really drives me nuts. Isn't having a moral compass almost the opposite of loyalty (at least in cases where both are involved)? Maybe not 100% opposites, but enough that it would seem they don't belong together like that...


It's interesting that they use the word "integrity", when I would consider integrity standing up against your employer (if they are doing something bad). They seem to be conflating "integrity" and "loyalty".


This would seem to make the most sense, as it's the same line of thinking around why only 1 out of thousands at the NSA would be a whistleblower (see: Snowden), despite the shady shit that they were (and probably still are) undertaking.

Also, non-disclosure agreements really fuck people in the overall scheme of things. If you worked for Facebook and came out tomorrow showing some really shady shit that they were doing, it's conceivable that no other company would ever want to hire you. (I think the term is "blackballed", but not sure if it's still in use today?)

(Whistleblower laws only protect government workers and do not affect the commercial industry. Even then, the whistleblower laws only go so far...)

So, you're damned if you do and damned if you don't.


> you're damned if you do and damned if you don't.

If having integrity and an interest in the public good aren't strong enough arguments to avoid working for these companies, then perhaps this one is.


>Facebook, which charged 30% service fees on the transactions, revealed in the internal email, “that an overwhelming majority of Apps using Facebook Payments to solicit funds are likely fraudulent.”

No surprise here, again. Facebook launches a feature that allows criminal behavior, reaps the profits, and only tries to roll it back after they are under intense scrutiny.

There is an easy way to fix these repeated problems: Hire people who care about ethics. Of course, the fact that they don't shows that they are only looking for a rubber stamp.

What kind of "privacy" team do they have that approved all the garbage listed here?


I work at one of Zuckerberg’s charitable organizations, but I’m considering resigning because of this kind of thing. At some point it’s not possible to separate the charity from the source of the money.


I think Computer Science needs to come coupled with exposure to a philosophical ethics sub-curriculum. I run into too many intelligent people who haven't ever spared a thought toward how to recognize "what is right? What is wrong?"

Philosophy may not equip you with a concrete answer, but it definitely equips you with the machinery to look at something and recognize there's more to the question than some manager saying it's legit.


Yeah, everyone hates Facebook, but is there anything new here? Targeted advertising based on location and relationship status? That's not a secret; that's one of the biggest selling points of their product. Can someone explain to me why I should be outraged?


I've often argued that the basic "dumb fucks" sentiment from Mark Zuckerberg's university days has not changed within Facebook's leadership, and I've frequently gotten push-back claiming that Mark Zuckerberg has matured and is now a responsible steward of our data.

I feel vindicated by news like this. It's pretty clear that privacy overreach and violations at Facebook are not "accidents"; they're standard operating procedure.



