Running https://neocities.org, I have a lot of problems with Safe Browsing too. The way it works is secretive and arbitrary, and there's absolutely nobody at Google (except a few friends whom I hate bothering) you can talk to when they make mistakes (which is quite often). They also don't give you any way to programmatically manage reports; I've been asking for an API for years and I still have to do everything manually through their crappy UI. It's one of my biggest worries: that they will make some arbitrary mistake that blocks millions of people, and there will be nothing I can do about it and nobody to talk to. Even their "support forums" are answered by weird non-employees who have decided to give Google free tech support for some reason.
Google still hasn't figured out that the web is made of its content providers and that it needs to support them. Treating its producers with contempt and neglect is a glorious example of how stupid and shortsighted the entire company is right now about its long-term strategy (how many ads will you sell when the web is a mobile Facebook app?). As a bare minimum, they should start providing representatives and support for the content providers who make people actually use the web, and help them to be successful, similar to how Twitch has a partnership program.
There is a fascinating class system for users of Google products, from the biggest GCP spenders, who are treated immaculately, to free-tier YouTube consumers with adblockers, who only exist to be A/B tested and tracked.
This “shoot first ask questions later” approach is typical not just for Google, but for AV vendors as well. The problem is that in the case of Google the potential consequences for a legit developer are much scarier: all browsers are affected, the whole website may be blocked.
The other, even more serious issue is that with Google there is no way to get feedback and learn what exactly they consider wrong.
Let’s also talk about their “unwanted software policy” that leads to this kind of warning. The policy mostly sounds okayish, but some points bother me. For instance, there is a point explicitly prohibiting working with Google APIs in a nonstandard manner. What does that even mean? Can ad-blocking software that blocks Google’s ad servers now be categorized as unwanted? Or any AV that scans Chrome downloads?
People would have a lot fewer issues with Google if it actually had a support contact that functioned. It is awful that they just kill a thing without even contacting the author, but it is made so much worse by the company's utter unwillingness to even accept the possibility that they made an error.
These sorts of problems are the reason I stopped doing Android and OS X apps. Being that beholden to companies that have shown time and again that they will not just wrongfully take down apps but also act anticompetitively in their marketplaces is just asking for trouble. Stick to games and web apps and avoid genuinely innovative takes on anything that Google (and Apple) has any remote interest in, or your business might just disappear overnight and you won't have the funds to stop them.
This sort of protection is just messed up but I can't see the USA taking action against this giant of a company any time soon. You still see people complaining about EU fines for anti-competitive behaviour regularly here on HN and yet they are a drop in the bucket. The web isn't free anymore and those that want their data to be their own went underground into the self-hosting community.
Specifically, this is a result of the imprint laws in Germany, which require you to publish contact data. An email address is optional, but if you put one up, you need to respond to incoming requests (mostly of a legal nature). The automated response wasn't considered sufficient by the court.
Consider the scale at which they operate... How could they possibly turn a profit, given the number of people who use their free and paid services and the number of people who are pissed off at their business practices? I know I want to call and harangue them at least once a month... Imagine if every single Gmail user could do that. They would collapse under their own tech support costs in just a few years.
They could afford it. It would be expensive but they are so hugely profitable that the cost of a large customer support team would be a small drop in the bucket of incoming cash.
There needs to be accountability. I can't put out notices accusing random businesses of being scams without opening myself to some sort of libel or defamation lawsuit. The same should apply to security vendors that falsely mislabel other peoples' software as being malicious without good cause.
That is just silly from both a security standpoint and a libel and defamation standpoint.
They are specifically based on heuristic algorithms, which by definition aren't perfect but can actually keep up with the massive volumes and unknowns. It is /not/ random or capriciously applied, even if outcomes are flawed, which makes the comparison apples to hand grenades regardless of false positives.
Even if their approach utterly sucked it wouldn't be libel or defamation any more than saying "Never trust a company which changes its name without the old name featuring something which became obsolete." or "Any company with a rate of growth over 10% for ten consecutive quarters is probably a Ponzi Scheme".
No, that's not how things work. You don't get to apply a defective algorithm to a problem, and then claim that you aren't responsible for the results since the algorithm that you created is defective. Google has a very high false-positive rate with this.
Any time they make a statement about another entity which is demonstrably untrue and causes harm to somebody's reputation or prevents them from generating revenue, that is grounds for a defamation lawsuit. The fact that they have automated their defamation does not remove their culpability.
It might also be a violation of anti-trust law since Google has its own software-delivery channels which are not subject to the same warnings.
>Even if their approach utterly sucked it wouldn't be libel or defamation any more than saying "Never trust a company which changes its name without the old name featuring something which became obsolete." or "Any company with a rate of growth over 10% for ten consecutive quarters is probably a Ponzi Scheme".
It's completely different because those statements are not made about specific entities, nor do they make specific accusations about any entities.
This has nothing to do with Google's monopoly. MS Windows will also warn about unsigned executables downloaded from the Internet, at least until these have become well-known enough. Reproducible builds will definitely help with this, both by establishing social trust in your release and by having a single version of the binary that will eventually stop getting these warnings.
Apple has similar issues these days, with their weird "app notarization" requirements that may even require you to pay the platform vendor in order to be acknowledged as a "trusted" developer.
I think the author’s point is that Google’s standard for what constitutes a trustworthy download creates a barrier that may prevent the new app from gaining a userbase large enough to sustain itself and get onto Google’s safe browsing list. This is the definition of limiting market access.
Google’s standards are arbitrarily set and applied, with no evidence of community involvement in setting those standards.
This describes the precursor to monopolistic behavior.
Microsoft is very open about how it works: get an EV code signing certificate from any vendor to remove the warning entirely, or have enough users click through the warning (and getting a cheap "regular" code signing certificate leads to a less severe warning if I remember correctly).
Google is completely opaque, there is no documented path to get rid of the warning.
A code signing cert only removes the warning during actual installation, after you've already downloaded it. For new software, Microsoft still flags it as "uncommon", even if a code signing cert was used.
Not sure if an EV cert makes a difference as you say, but they are certainly prohibitively expensive.
SmartScreen is Microsoft's reputation system, used in IE (since IE9), Edge, and Windows 8 and 10. [1] is an introductory blog article about the system in general, revealing among other things that both the executable and the cert it's signed with gather reputation. [2] goes into more detail on code signing, and [3] talks about EV certs (basically they are a massive reputation boost, starting you high enough to bypass the warning and letting you gain reputation faster).
A regular code signing cert can be bought for about $6/month, and an EV cert for about $25/month (if you shop around or buy multi-year certs). Both are expensive in the context of open source, but I'm not sure I'd call the cost prohibitive.
I got a 4-year cert from K Software for $234 (which is < $5/month). Of course, I had to pay for 4 years up front, but TBH I'd much prefer that to repeating the verification process, which is a series of time-consuming, farcical hoops to jump through.
The reason I state that it does is because at least Chrome (and its derivatives, which make up 70% of the browser market) and Firefox rely on Google Safe Browsing to flag downloads, and Google controls 92.42% of US search users.
Having one's website and/or downloads flagged as harmful, and potentially being deindexed for hosting malware, is not something any software developer can ignore.
If Bing and Edge flagged my downloads, I would honestly not care as they control 2% of the market. The Windows "this file was downloaded from the internet" warning is something that, regrettably, is so common that users already ignore it, and it happens even for many commercial software programs. Although I do consider that an unfortunate hurdle for free software developers as well, the harm is substantially smaller.
> MS Windows will also warn about unsigned executables downloaded from the Internet
OK, sure, Windows adds an additional problem, but the Google problem still needs to be solved.
> Reproducible builds will definitely help with this, both by establishing social trust in your release
"social trust" is not (so far as anybody knows) a metric that Google uses to decide whether a particular download is malicious.
> and by having a single version of the binary that will eventually stop getting these warnings
Only a single version of the binary was uploaded, and "eventually people will stop getting these warnings" only solves the problem for the current release, not the next one or the one after that.
My larger fear is how opaque the process is: will it just continue to flag my downloads as being harmful? Or will it eventually escalate to blocking my entire page/website as I've seen happen to others? Will it result in search penalties if I continue to release new software? How can I make these warnings go away permanently? None of this is explained anywhere that I can locate. All I have to go by are the scary warnings asking me to "secure my website from future attacks" and that their review found I "no longer" host harmful downloads now that I've removed my own, safe software from my website.
Sadly this is really one more line of defense - perhaps one of the last lines of defense - against the pervasive threat of malware which has massively eroded trust in downloadable binaries. This has been going on for decades, but the rise of money-making ransomware has given criminals a powerful new profit motive for making and distributing malware every possible way they can.
Let’s take some examples from the post. If a domain is 15 years old and implicitly trusted by this point, then an attacker is just going to compromise an ancient WordPress install to post malware.
If an OSS developer occasionally releases software, attackers might approach them to add a new SDK or monetization opportunity (happened many times to VLC - good thing JBK hasn’t been tempted!), or just straight up attempt to compromise their infrastructure (e.g. download servers, has happened to many pieces of software like Transmission).
If we actually had Let's Encrypt for code, attackers would trivially get certs for their stuff. Then end-users would have to decide which certs to trust or not, which would significantly weaken the purpose of code signing.
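The gap here can be seen in miniature with any signing API: a signature only proves the file came from whoever holds the key; it says nothing about whether that key holder is trustworthy. A minimal sketch using Node's built-in crypto module (an illustration of the general principle, not of any particular signing scheme an OS actually uses):

```typescript
import { generateKeyPairSync, sign, verify } from "node:crypto";

// Anyone -- including an attacker -- can generate a key pair and produce a
// perfectly valid signature over their payload. Verification answers "was
// this signed by that key?", never "is the key holder honest?".
function signAndCheck(payload: Buffer) {
  const { publicKey, privateKey } = generateKeyPairSync("ed25519");
  const signature = sign(null, payload, privateKey); // Ed25519 uses algorithm=null
  return {
    genuine: verify(null, payload, publicKey, signature),
    tampered: verify(null, Buffer.from("tampered"), publicKey, signature),
  };
}
```

So automatic, Let's-Encrypt-style issuance would guarantee integrity (the file wasn't modified in transit) but not safety; deciding which signers to trust would still land on the end user.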
Short of just sandboxing all binaries by default, I don’t see a great solution to make binary downloads safe. macOS is already moving very heavily in that direction.
Or maybe the vulnerabilities should be handled at the OS level, with confirmation of privileged access at the user level. If I download a notepad app and it asks for permission to record, that is a request I can respond to.
The OS vulnerabilities being exploited should be fixed. A whitelist managed by unaccountable tech oligarchs should be laughed at and then resisted at all costs.
Yes, this is more or less sandboxing. The fact that a freshly downloaded binary has r/w access to all of my files should be considered a vulnerability nowadays. The macOS approach, in which a binary only gets access to files that are explicitly selected through the OS’s file chooser, is a good first step. The downside is that it can get very annoying for the user. An ideal solution for desktop machines still doesn’t quite exist.
Because the internet is such a scary place, Google should get to decide which software gets to thrive? I'm sorry, but your reasoning doesn't make sense to me at all.
Yep, letsencrypt handles peer to peer encryption not authentication which is supposed to be the point of codesigning. Not that codesigning helps that much either - you still get big scary prompts on Windows after signing until you generate enough downloads to gain further trust. It’s all a mess.
Certificates are precisely for authentication, but the simplest form employed by automatic systems like letsencrypt only tries to verify domain control. It does not bind the certificate to any legal identity, unlike stronger forms of certificates.
Once again, the title has been (ungrammatically) editorialized with a question mark. It would appear that this is HN editorial staff policy.[1]
What is the purpose of this policy, and in particular what criterion is being applied here, that does not apply to (e.g.) the (considerably more subjective) title "McDonald's holds communities together" which remains un-editorialized on the front page of HN?
I am similarly curious about the difference between "Apple News No Longer Supports RSS"[2] and "Google bans niche browsers from Gmail"[3] (to my eyes they are virtually identical submissions - third party forum-based verifications of changed behavior of Big Five software).
The question mark is absolutely unnecessary, and gives credence to the idea that the site admins are somehow biased in Google's favor. I would like to see this stop as well?
This problem also occurs for "downloads" that are generated purely in javascript. I see this warning in the search console for BulkResizePhotos.com and (as the article mentions) even after having it reviewed and passing the review, the warning comes back within a week or two.
Dreading the day that Google decides to turn this warning into a search penalty. The other image resizing websites typically upload the images to their server and download the resized ones back again. That sucks for users, who have to wait for all that to happen and have the privacy of their images put at risk.
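For context, a "download" on a site like that never touches a server at all: the browser builds the bytes in memory and hands them to the user. A minimal sketch of the pattern using standard DOM APIs (the actual resizing logic is omitted; `makeCsv`/`saveBlob` are illustrative names, not any site's real code):

```typescript
declare const document: any; // browser global, not available outside the browser

// Build the file contents entirely client-side.
function makeCsv(rows: string[][]): Blob {
  const text = rows.map((r) => r.join(",")).join("\n");
  return new Blob([text], { type: "text/csv" });
}

// Hand the in-memory bytes to the user as a "download" (browser-only part):
// create a temporary object URL and click a hidden link pointing at it.
function saveBlob(blob: Blob, filename: string): void {
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}
```

The browser's download machinery apparently evaluates the resulting file just like a server-hosted one, which would explain why a site can get flagged for "harmful downloads" despite never receiving the user's data.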
1. Create a DNS record for subdomain and name it to something like downloads.byuu.org
2. Put a file there. The preferred naming scheme is to categorize it by product or product category, so something like downloads.byuu.org/emulators/higan.zip will do fine.
3. Start by putting the downloads in non-executable file formats, e.g. use higan.zip instead of higan.exe. It's a no-brainer for a Windows user to launch the file from a ZIP archive
4. Make links to downloads from your main website as you would usually do
5. Everything should be smooth now
6. Your downloads.byuu.org subdomain will slowly gain reputation
7. Once it has an established reputation, you will be able to put .exe files there. Gaining that reputation will take several months; I would budget about 12 months to be on the safe side
My main website has a domain authority of 53; a subdomain with the same IP would be seen as the same site (I used to use subdomains and have seen this firsthand); a subdomain with a different IP would start me over which is a very bad thing (I've been struggling with byuu.net in this regard.)
The URL scheme is the convention I follow, but I also include the version# in the file name.
I did put the executable inside of a ZIP archive, along with a text database and a few video pixel shaders. One change I made recently was moving from .7z to .zip since some users don't have 7-zip installed, but I presume Google is smart enough to scan inside 7-zip archives even if Windows isn't (out of the box at least.)
I've been providing these releases for fourteen years now. I have no idea what suddenly changed other than it took about two years to release a new version due to a lot of massive changes.
I appreciate the reply all the same, thank you for taking the time.
>a subdomain with a different IP would start me over which is a very bad thing
Cannot confirm that. An IP is a virtual thing in terms of HTTP web hosting. I have experience attaching fresh IPs to existing domains and attaching fresh domains to old IPs: the only thing that matters for HTTP reputation is DNS. IPs do not really matter unless they are seriously blacklisted by a manual action (which is not your situation, I presume).
>Google is smart enough to scan inside 7-zip archives
Google sees raw EXE files as a risk factor. Once a domain serves a naked EXE, Google gives it a higher risk score.
Publishing an EXE file inside an archive (of any format) significantly lowers that risk, because an archive cannot be directly executed by the OS.
(Please note that a lot of corporate internet gateways do not allow naked EXE files via HTTP for the very same reason)
Well it would be both, right? A new IP and a new subdomain. When I did that for a wordpress install it had very low (<10) DA. I trust your experience more than my one attempt though.
Also my program was inside a ZIP archive. It did not help me.
They probably calculate different scores depending on the purpose (SEO, safe browsing etc).
If you want to stand your ground, you could simply add a password to your archive (this works best with 7-Zip, since its header encryption can also hide the file names and extensions); just mentioning it for the record.
I maintain a small project and this works fine for me (though I do directly host executables). The bigger problem is crappy "anti-virus" software flagging it but those get resolved after a release is a few weeks old or so.
You can get code signing certificates much cheaper than that - I paid $234 for a 4 year certificate from K Software (IIRC, they are a Comodo reseller).
Yes, it's still money that cash-strapped OSS devs might not have, or indeed might not want to pay on principle. And yes, the validation process is a total farce and a PITA.
But I don't think it helps the argument to use the most expensive certificate they could find as an example.
Also, having a cert does not mean your software won't be marked as "uncommon" - presumably Google (and Microsoft) use a certificate as a signal, but it seems only the number of downloads really counts. And I do agree with the thrust of the article, that this harms OSS and indeed small businesses.
First off, thank you for bsnes. Have used it a lot in the past.
It's a tough call, but it's somewhat understandable what Google is doing. Arbitrary binary downloads from arbitrary websites are for the most part a problem for the majority of users.
There is nothing stopping a savvy user from still finding and downloading your binaries. You should probably figure out a better distribution channel.
Remember that all the hoops you have to jump through to get a signed binary are also required for anyone who would want to pirate your software and re-release it with a virus ( which I have seen done in the emulator community before ).
Needs smart regulation. I can't fathom many governments being smart enough at this level of granularity. Maybe when gen-Xers or millennials are fully in charge.
The simplest regulation could be: all policies have to be transparent, consistent, with predictable outcomes, and there must be a process for addressing grievances.
Then Google would have to hire 20 000 support people and act like a regular company.
The people who want to be in charge will be the ones who want consolidation of power and authority. No, future generations are not our saviors in this regard.
"For instance, I wonder what would happen if we were to ban all mergers and acquisitions involving companies above a certain size."
This is de-facto the case. Mergers are monitored by the FTC etc. subject to a lot of scrutiny.
Unfortunately, even with this, it's really, really hard to define what monopoly and anti-competitiveness really are.
Some think Disney should not own distribution, but Apple is also vertically integrated - and distribution channels are so volatile it's hard to regulate.
It's possible things might settle down in a few years and we might be able to establish boundaries.
>This is de-facto the case. Mergers are monitored by the FTC etc. subject to a lot of scrutiny.
No, it's not the case. Only mergers between two large companies are monitored and then they are allowed to go through most of the time.
What I'm talking about is banning all M&A (and even certain asset purchases) where _one_ of the companies involved has more than, say, $20bn revenue (or some industry specific metric). Large companies would only be allowed to grow organically.
Obviously startups and VCs would hate the idea, because it would block one of the most favoured exit strategies. What it would mean is that startups would have to sell themselves to medium sized companies, join forces with each other, and/or go public and compete with the giants.
Yes, it's 100% the case that the FTC reviews all M&A of sizes substantially smaller than $20B in revenue. There are reporting requirements, and it's something like if the merged entity has >$200M assets (I'm not sure of the details but something like that), they have to report the merger to the FTC beforehand.
The FTC probably should allow most mergers to go through.
It would likely be inefficient to bar companies from acquiring one another beyond a certain scale; I suggest deference be given to the liberal side of the equation, with regulation applying only within certain constraints.
The hard part really is defining those constraints, and determining what constitutes anti-trust.
Amazon massively subsidizing its delivery operations with AWS profits and putting FedEx out of business by shipping for less than cost would be ... problematic. Taking Search profits and giving Android away for free is a form of dumping. But then without this, some entire industries might not exist!
>Yes, it's 100% the case that the FTC reviews all M&A of sizes substantially smaller than $20B in revenue.
Maybe so, but it is not the case that mergers get blocked purely based on the size of the companies involved. It's simply not lawful for the FTC (or other regulators) to do so.
>The FTC probably should allow most mergers to go through.
That is the status quo and it is clearly unsatisfactory in some areas.
>I suggest deference should be given to the liberal side of the equation, with regulation affecting only within certain constraints.
I completely understand why you are saying that, and it has always been my preference as well.
But the problem is that these constraints have become so difficult to specify that the likelihood of ineffective, counterproductive or abusive regulation has risen dramatically.
That's why I'm wondering whether it wouldn't be better to accept that size itself inevitably creates problems that no case-by-case game of whack-a-mole will ever solve.
We need simpler rules that can be consistently enforced.
I'm far from convinced that my particular idea is any good. I'm just putting it out there as an example for the kind of simplicity that I think we need.
What's the problem here? Google is being honest. It's up to the publisher to convince the user that they are trustworthy enough to bypass Google's warning.
They can explain the issue on a web page gating the download page.
What alternative is there? Even if I trust you, how do I know a hacker hasn't cloned your site and added malware?
Let's face it... the average user should not be downloading executables from the web ever these days, except to use Safari or Edge to download Firefox or Chrome. (Or a handful of trusted brands like Adobe, Microsoft, etc.)
There's no trust, accountability, or security. Instead, app stores and package managers provide these things. They're not perfect, but they're waaay better than totally untrusted binaries.
And if you're an advanced user, you can ignore the warning. Or know to download binaries linked from a project's GitHub page, etc.
Let's face it: the "open web" is not a secure or trustworthy place for downloading binaries period, unless you're on a well-known trustworthy site (again -- Mozilla, Microsoft, Adobe, etc.).
- First, I disagree that every program an average consumer might want or need is available on an app store.
- Second, I strongly disagree that app stores provide security, trust, and accountability.
App store security is really bad. At best, we have Debian repos, which are clean-ish mostly because nobody cares about writing malware for desktop Linux, so the moderation is much easier. At worst, we have the Windows store and Android. These platforms are not effective at screening out malware, because content moderation doesn't scale to these levels, and blocking malware is just another form of content moderation.
Telling people to trust app stores and not downloaded binaries is like telling them to trust Amazon and not Ebay. You're right, there is technically a difference, but the difference is not big enough to matter. If you download random things from any source, you will mess up your computer. It'll just happen faster with downloaded executables.
There is (unfortunately) no shortcut to get around teaching people about security. At some point, native platforms will catch up to where the web was 10 years ago and start doing a better job of sandboxing executables, and then the job of educating users will be easier. We're just unfortunately living in the world where that hasn't happened yet.
"Get rid of unofficial software" is counterproductive to what we actually need to do -- to update our native permissions and security models to match modern users' requirements. But even though mass-moderation is a band-aid fix that doesn't even work well right now, it's heavily promoted by companies like Apple, Google, and Microsoft because, under the guise of security, it gives them a new stranglehold over the common-user software market, which was traditionally un-monetizable by them.
Originally on Windows the app store was coupled to the sandboxed application model, on the theory that users would learn to associate installation from Store with safety and reliability-over-time. For better or worse, they were gradually decoupled over time and now Store accepts unsandboxed Win32 apps and sandboxed UWP apps can be installed from the web or otherwise outside the Microsoft store.
Maybe so, but it's still significantly better than native binaries.
> Telling people to trust app stores and not downloaded binaries is like telling them to trust Amazon and not Ebay. You're right, there is technically a difference, but the difference is not big enough to matter. If you download random things from any source, you will mess up your computer. It'll just happen faster with downloaded executables.
The difference very much does matter. I suspect many people on this website have had the same experience as me: I had to do frequent "maintenance" on my parents' computers because they got filled up with IE toolbars and whatever other BS they could find to screw up their computers. After the switch to phones and app stores, this doesn't happen any more.
People without family members capable of fixing that sort of old problem are both (probably unconsciously) grateful for the app store takeover, and vastly more numerous than indie software developers grouching about not being able to run any code they like on anyone's computer anymore.
I do tech support for multiple family members, and I have a policy about this. If I trust someone enough to hand them an Android app store, I also trust them enough not to download malware from the open web. On the other hand, if I don't trust someone to download software off the Internet, I also don't trust them with an app store.
There's a fair amount of anecdotal evidence there, I can't give you hard stats to back that up. But I suspect a lot of the "app stores improved security" anecdotes people have are actually due both to family members just slowly getting better about security in general, and (to a greater extent) the fact that phones are doing a better job than Windows/Mac of embracing the web model of sandboxing applications.
> because they get filled up with IE toolbars
This example in particular makes me smile, because I have family members on Firefox today, and they still end up with random malware/adware extensions, they just install them from the official store. It does nothing to help -- I've asked them how they got installed, and they don't know where they came from. Websites just asked them to click somewhere, and they did.
Firefox has gone through all this trouble to make sure everything has to be signed and vetted, and it has made no difference at all to my family members :). What they should do is move the extension locking capabilities from the Enterprise version to the regular version, so I can set up Firefox with a few extensions and then freeze it so that nothing can be installed, even from the official store.
Chrome's app store isn't any better[0]. Anecdotally I have roughly two options when I set up someone's computer. Either teach them about security and harden the platform itself, or make it hard for them to install any software from anywhere (usually by moving them to something like Linux and manually handling all of their setup). I haven't personally seen any evidence in my tech support stories that official app stores are helping my family members.
Yes, the frequency will go down. But this is an area where the gains have to be more drastic to be worthwhile. The support frequency only matters for trivial malware like adware and crypto-miners. It doesn't matter for stuff like ransomware, password theft, or phishing attacks. And the gains today are probably about as good as they are ever going to get. Universally, moderation gets worse as systems scale. Android has more malware because it's a bigger platform. NPM gets more malware because it's the biggest package manager. I very firmly believe that app stores don't scale, because we can look at app stores today and see that they're not scaling well. It's a security dead end.
The thing is, there's a big gap between "mass-market software that can budget for the friction of an app-store release" and "niche software that only advanced, technical users know or care about". It's perfectly possible to have niche software that only, say, accountants care about; or niche software that only bee-keepers care about.
It would be really nice (in the "this is why we can't have nice things" sense) if there were a way to distribute software tools to niche audiences without them necessarily needing to be technical audiences. Up until recently, the Open Web has been that system, but it seems Google can unilaterally disable it via their Safe Browsing feature.
This would be true if the stores you are talking about were not imposing countless limitations. The Microsoft Store is for UWP apps only, the Mac App Store is for sandboxed apps only, and I'd better not even start on the Play Store and iOS App Store restrictions.
We need independent stores whose purpose is to prevent security issues without imposing additional limitations.
That's the whole point - the average user should be able to run code from random third party developers only in a sandbox.
Preventing security issues requires imposing additional limitations, you can't have your cake and eat it too. Either the apps require serious review (i.e. a much, much higher bar of entry than iOS AppStore), or the apps need to be restricted so that they can't do much, so strict sandboxing.
Running unsigned, unsandboxed binaries from a small developer is a big security risk which can't be prevented in a cost-effective manner, so average users have to be warned that it might as well just encrypt their files for ransom.
Yet you nearly always blindly download and execute code when you browse, in overly complicated software in which dozens of vulns are found each month, with little hope of that slowing down.
Yes vulnerabilities are found -- nothing in this world is 100% perfect -- but in practice, running JavaScript in your browser is orders of magnitude safer than running binaries with access to your filesystem, hardware, and more.
Yes, and the security mainly comes from the sandboxing, not from some hypothetical stores or websites making random judgments. (I also know that the reputation and blocking of a website can depend on the presence of hostile JS, but given what is described here, Google's judgment seems to depend on quite "random" factors that should not be relied upon for JS content either.)
If you want to move toward security, unilaterally disabling parts of the web because of flaws in third-party legacy OS design is not the way to go. I honestly don't see how Google's described approach would yield any interesting true_malware_blocking / false_positive_blocking ratio. And you know what would be even safer? Shutting down the computer when the user attempts to browse the web. Browser vendors should let the OS antivirus take care of its own business -- if I want my whole computer to be taken care of by Google, I can go buy a Chromebook...
I was saddened by Edge's move to the Chromium engine, but honestly, while I used the old one from time to time for very specific purposes, I could not recommend it to anybody. I will still try to make people use FF by default, but I'm starting to think the new Edge's position will be interesting, and it could be good to try to switch some users from Chrome to Edge.
In other words, whatever brands you know from life experience are a legit product, not malware that will turn your computer into part of a botnet. If you're a designer, then you know Sketch and Adobe, etc.
I do not trust Adobe, Microsoft, Google, or Mozilla for that matter.
Your idea that they are "trusted" is laughable, and it is also dangerous to our liberty and security to believe that only these gilded companies should be authorized to say what software is allowed on a given platform.
We have seen massive censorship as a result of this, with apps being banned from various app stores not for security but for political reasons.
Agreed, and even if there wasn't a Chrome warning, there is then the Windows warning (orange, click yes to bypass), or if dangerous the SmartScreen warning (red, need a few more clicks to really get through).
Of course, counting forks as part of a monopoly is nonsensical as a metric in the first place. It would be like calling Burger King and all fast food chains using order counters and drive-throughs a monopoly on burger restaurants. I wonder how long until we see an "It is not RICO" counterpart for monopoly, given how en vogue it is in talking points.
"Make it cost money" is, unfortunately, the first line of defense when dealing with bad actors. This is why some folks get prompted for SMS 2FA if the ML model thinks they're suspicious: a cell line costs Real Money.
Microsoft, Google, and Apple all require certificate signing for software to show up as "trusted" ($350/year is really, really annoying for a legitimate developer, but it is an insurmountable wall for someone distributing hundreds of bad apps). Google's approach lets popular free software get a pass without having to pay, but, yes, it's a trade-off.
In my opinion, the easiest thing to do is to (1) put the windows binaries on a separate domain, (2) provide screenshots (not links) telling people how to download them from the other website, (3) include screenshots of how to bypass the Google warning, and (4) include instructions on how to verify the authenticity of the binary out-of-band (checksum, etc). This matches how folks handle other unsigned binaries (for example, drivers).
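The out-of-band verification in step (4) can be sketched as follows on the user's side. All filenames here are placeholders, not the author's actual release artifacts; in reality the published hash would come from a separate page the user trusts:

```shell
# Simulate a downloaded release artifact (placeholder content).
printf 'example release payload' > bsnes-setup.exe

# The publisher posts the SHA-256 hash out-of-band (e.g. on a page
# served over HTTPS). Here we generate it locally just to illustrate
# the file format sha256sum expects.
sha256sum bsnes-setup.exe > published.sha256

# The user recomputes the hash of their download and checks it
# against the published value; a mismatch prints FAILED.
sha256sum -c published.sha256
```

The same workflow applies to any unsigned binary: as long as the hash is obtained over a channel the attacker doesn't control, a tampered download will fail the check.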
Firefox uses Google's Safe Browsing service to flag websites and downloads, so that would not help me. Even if not, it would be a pretty bad idea to tell 70% of visitors trying to download my software to use another browser first.
This is not necessarily a lasting fact. DuckDuckGo works just fine, and as for browsers, so does Firefox. Even learning the differences in browser devtools isn't as hard as it seems. Don't accept a premise that produces a doomed conclusion.
Edit: why have I never run into this problem? Am I not a heavy app user or is this mostly on Windows?
It may not necessarily be a lasting fact, but it probably will be. Google has over 90% of the market share, and the average person is so asleep and so apathetic that they'll continue to get a warm fuzzy feeling every time they see the friendly looking Google logo.
DDG isn't going to break a monopoly on search, much as I want it to (I use it myself). And it's not just search: browsers may begin blocking pages and downloads outright. Basically, Google is telling me now that I have to remove the links and is flagging them as a (false) security issue. I don't know what happens next, or when, if I choose not to comply, so the links are now gone.
A solution on Windows might be to make it available through the Microsoft Store. The whole authentication process is free, and the download and installation bypass the browser.
I understand it is an awful solution. But, like in politics, sometimes an awful solution might be your less bad choice.
That may be a solution for many, but Microsoft and Apple both forbid emulators (the software I create and distribute) from their app stores, in spite of repeated and unanimous court rulings establishing their fair-use legality.
I still don't in general like the slow erosion of the web, however.
The web is stuffed with malware. That's not Google's fault. Our computers have more value to protect than before. That's a good thing, but it requires security. Instead of focusing on your download metrics for software you aren't even charging for, think about your users who don't know the difference between you and a thousand malware sites that look like you.
It's simply not a good idea to download and install binaries from arbitrary websites. It's been too heavily abused. It's not a reasonable strategy to have Google or anyone sort through that mess outside of systems or services that have been designed to handle this problem.
GitHub, OS package managers, free package managers, and other aggregators provide mechanisms to share trust and to moderate and review posted binaries.
Consider using these, i.e., instruct users to install via their package manager. If your audience is not technical, then you need to put it in the common man's package managers: the app stores :/
I take your point, but if you download executables from Github using Chrome, IE or Edge, you're still going to get a warning when it's deemed "uncommon".
The only real option on Windows is the Windows Store, which AFAIK is only for UWP apps.
There are chocolatey and scoop, but I find chocolatey a bit of a mess (e.g. duplicates, never certain which is the "main" download), and while scoop is good, the selection is still relatively small. These are also only really used for OSS software, which doesn't help ISVs.
But that tends to come off as presumptuous in most cases.
The use here is more informal. It can't be requesting confirmation, since it's not addressed to a single person, so it's just indicating general skepticism or uncertainty. In that case it's more of a statement than a question; you wouldn't answer it yes or no.
He could just add the Windows binary to the GitHub releases[0] and link to that, though. I'm not sure if linking directly to the binary will cause similar problems, but he can at least publish the binary and link to the latest release.
If it happens to flag my GitHub page, I'm not really in a better position as far as being able to distribute my software goes. It would also be problematic if other sites began linking to my releases page instead of my official page which also includes documentation, screenshots, feature lists, etc.
If I can get a confirmation from someone at Google that they will trust GitHub download links more, then I'm willing to go this route for now.
From my experience this works fine (using links to GitHub downloads), see also my other reply.
They don't assign a score to "GitHub accounts", but they use some form of score/accountability based on the domain. As far as I know, the main vector they are trying to protect users against is emails with links to binary files hosted on random hacked servers.
> And that is to say nothing of the risks you take these days online by publishing your legal name.
I understand that you're in a risky line of “business” with emulation, where one wrong step can get you some lovely letters from lawyers. However, for the sake of argument: Is there any reason you couldn't get someone else to lend you their name so that they act under their real name for you? Surely that'd be an option for risk-averse people.
> In my own case, this has effectively prevented me from releasing compiled binaries of my own software going forward. If code signing is a requirement to distribute free software, then we need a Let’s Encrypt-style alternative for code signing—yesterday.
The whole point of a code signing requirement is to add a paywall so that only two kinds of people will have access to it: legitimate developers willing to pay for a certificate, and bad actors sophisticated enough to steal a code signing certificate from someone who has purchased one.
It's a net gain for security. Software freedom, considering increasingly prevalent SaaS and closed-source apps on mobile devices, is already lost. So if we've already lost software freedom—as far as I can tell, more or less irrevocably—then we might as well at least reap the security benefit for the common person while we're there.
> However, for the sake of argument: Is there any reason you couldn't get someone else to lend you their name so that they act under their real name for you?
It's possible, but I would find it to be rather unethical. I am much more willing to allow an EV certificate to sign my software, or if I could get the BBB to respond to my requests to register with them, I could even consider purchasing my own EV certificate for my LLC. (my understanding is that the EV validation process confirms your business' validity through its BBB listing, and an article of incorporation is not enough.)
> The whole point of a code signing requirement is to add a paywall so that only two kinds of people will have access to it
Why is the web and Let's Encrypt any different? Websites execute code that can potentially harm your computer (via zero-days.) A paywall harms free software developers who can't afford hundreds of dollars a year for certificates, which is not a problem for me, but would be for many folks.
Locks and keys harm poor people who can't afford them. Filtering water harms poor people who can't afford to remove pollution. Blame the criminals, not the security providers and consumers.
> Why is the web and Let's Encrypt any different? Websites execute code that can potentially harm your computer (via zero-days.)
The web is as much of a remote code execution vehicle as it is an application platform that could theoretically do a lot of things without the remote code execution in the form of wasm/JavaScript. TLS solves the issue of people eavesdropping passively and MITM actively to do real-world harm by stealing credentials or injecting malware: It was a solution to an actual problem. Don't get me wrong, I am very much advocating for requiring TLS EV certificates if you serve JavaScript or WebAssembly once we've finished purging the plaintext web. It's a necessary evil to get more accountability for code and subsequently ease prosecution for hosting and distributing malware.
People downloading and executing other people's code is also a problem in need of a solution because of the very much non-trivial risk of malware these days. App stores have worked on mobile (at least it's improved the mobile threat landscape compared to traditional desktop computing). The idea of an app store can be made to work for desktop computers as well to reap the same security benefits of having a central, reviewing gatekeeper that is subsidized by everyone publishing there to pay a cost.
The proper solution would be to have mobile-like sandboxing capabilities on Windows, macOS, and Linux, but that's still far off. Mandatory code signing with personal identification is just a stopgap measure.
> A paywall harms free software developers who can't afford hundreds of dollars a year for certificates, which is not a problem for me, but would be for many folks.
I don't deny that this is a problem for many folks. But this assumes (executable) free software is desirable. It isn't. End-user software must be made either at a loss (of time) or for profit. This just makes the loss much more economically explicit. In the long run, this could give back value to software in the perception of users, which I consider to be a good thing.
> Is there any reason you couldn't get someone else to lend you their name so that they act under their real name for you? Surely that'd be an option for risk-averse people.
This just pushes the problem up a level. The front-person would assume the legal risk, and if they're trying to avoid it they will let the legal system know the "real" person.
IRL there are "goalies"[1] - indigent individuals who for a low price will assume the legal risk of, for example, registering ownership of a car. This is a grey area indeed.
[1] translation of the Swedish term "målvakt", from where I know of this phenomenon.
Google has a large share of search activity, but it's not at all clear that they have pricing power on search (the usual yardstick for a monopoly) or that search is even a market at all, since no one pays for it.
Search advertising is a different story, of course.
Google receives 92.42% of US search results. If you want new customers or users to find you, there's no choice but to do as they say. Also see AMP and media publishers.
Google earns 92.42% of those search results, by being better, at least subjectively, than their competition.
Now if you listen to the various whines of said competition, Google sure looks objectively better too, using extreme personalization to drive more relevant search results to their userbase.
There is no pressure there. If alternate search engines were subjectively (not even objectively!) better, people would switch overnight. After all, they are just a click away.
> Google earns 92.42% of those search results, by being better, at least subjectively, than their competition.
Have you tried Google search lately?
It is almost never the case in technology that the superior product wins; it's usually the first to market with a really killer product that does. It takes massive inertia to displace an incumbent, and Google managed it because search engines prior to it were nearly useless portals (Yahoo, AltaVista, AskJeeves, etc.).
The requirements and conditions to displace Google now are virtually impossible, and that's even before factoring in their massive data profiling advantage.
Some developers like to establish a relationship directly with their users, not pay (in dollars, or advertising) a middle-man like GitHub, GitLab, or an App Store.
But look at it from the other point of view: how does a non-technical user determine if a binary download is malicious?
And how, then, does Google/Microsoft/Apple protect those users from their ignorance?
Given that the internet is full of people attempting to get non-technical users to download malicious software, often by mimicking exactly the sort of site the OP has created, is it really practical to insist that Google/Microsoft/Apple allow the OP's site to download software to a user's machine freely?
The advantage of the middle-man is that it acts as a trust agent (not necessarily well, of course). If you download a malicious binary from an App Store, that is the App Store's fault for letting it on there in the first place.
Sure, you need some kind of middle-man as a trust agent, but Google/Microsoft/Apple are not the only possible trust-agents, and their model is inherently biased towards certain useful software production models.
Let's say I keep bees as a hobby, and I write some small piece of software that tracks and calculates something to do with honey production. I post it to my favourite bee-keeping forum, other people try it and like it, and when a new bee-keeper joins the forum they're often advised by forum regulars to try my software out too.
That kind of software can be a huge help to people, but it's not a good fit for an appstore because it's never going to turn a profit, and at least on Apple's store (with the $99/year publishing fee) it'll drain money quite predictably and regularly.
A bee-keeping forum will never be trusted by as many people as Google/Microsoft/Apple, but the people who do trust it probably trust it a lot more.
You could simply host the binary on GitHub or GitLab and have the link on your website point directly to it. The user would never know, so you'd get the best of both worlds: you develop a direct relationship with your users without paying anything forward to GitHub or GitLab.
With that said, hosting the binary on those platforms won't necessarily help as Google can flag individual repos according to some other comments here.
Every indication I've read is that simply linking to the binary download, even if it's offsite, will be enough to flag the page hosting the link. So I would have to link to an alternate site hosting the link, and risk that page being blocked instead, and then I'm right back where I started.
I would, if at all possible, prefer to find a solution to this problem so that I can directly host my software.
If you prefer to directly host it then you will have to wait (and hope that no one files a complaint about your domain).
At some point your domain will have a sufficient score and it will not show the warnings to users.
How long that will take is, however, outside of my knowledge and it would be nice to have some official reference about it; as it stands I agree with you that it feels like begging to a benevolent dictator.
The point is that these warnings still appear for executable downloads from GitHub (at least they did a few years ago, when I independently thought of this as a workaround). My guess is that Microsoft and Google are using some kind of unique-ish binary fingerprint, such as Imphash or SSDEEP (in conjunction with other signals, such as domain age, digital signature, etc.).
> Because the default distribution mechanism for free-as-in-speech software is Github, these days.
I surmise that this mostly impacts non-free platforms, where compiling software from source used to be quite non-trivial so it was common to just download binaries. These days you can probably get LLVM to work everywhere, though.
Each user compiling their own binary is a lot of waste. Have you ever compiled a large project like Chrome or Unreal Engine? If you are even able to compile it successfully, it takes hours on a recent i5.
A few years back at my work we digitally signed our binaries, so I don't think non-free platforms are affected at all, since it is not expensive for a company to buy a certificate; only hobby/personal projects will be affected.
Also, a signed binary does not mean there are no viruses or other bad things inside. So if we want to identify the source of a binary, could there be a technical solution where your website and your binary use the same key, so you know that binary X is from X.com's people? For some reason, a few years back, these signing certificates were not cheap.
It's not "waste" if it's needed to establish trust.
> A few years back at my work we were digitally sign our binaries,
Just publishing hashes for the (reproducible) unsigned binaries over a secure channel (such as HTTPS as verified by Let's Encrypt) will give you the exact same security. Digital signatures embedded in executables add no security whatsoever, and make reproducibility harder. Just don't do it.
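A minimal sketch of that publisher-side workflow, with hypothetical artifact names; the resulting SHA256SUMS file is what you would serve over HTTPS next to the download links:

```shell
# Hypothetical release artifacts, stand-ins for real builds.
printf 'linux build' > mytool-1.0-linux.tar.gz
printf 'windows build' > mytool-1.0-win64.zip

# Generate one SHA256SUMS file covering every artifact and publish it
# on the HTTPS site; its integrity is then protected by TLS rather
# than by a signature embedded in the binaries.
sha256sum mytool-1.0-linux.tar.gz mytool-1.0-win64.zip > SHA256SUMS

# Users verify their downloads against the published hashes.
sha256sum -c SHA256SUMS
```

Because the hashes are computed over the plain, unsigned binaries, this scheme also plays nicely with reproducible builds: anyone rebuilding from source can compare their output against the published hashes directly.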
I think it is technically possible to sign a binary and still have reproducibility for the people who care; you might need a different binary/package than the current ones, but it is technically possible.
We had to sign our applications to prevent scary warnings from appearing; otherwise we would have had to train our users to ignore them.
The article talks about "free and open source" software in its text, so it's clear that "free" is referring to FSF version of free.
But then it talks about distributing binaries. Cry me a river; most FOSS can and should be distributed primarily in source form, in a git repo hosted on a site like GitHub. FOSS software can be released via distributions that supply controlled, accountable, and digitally signed binary packages to end users who aren't skilled enough to build from source. This includes Debian, SuSE, the Google Play Store, the Amazon App Store, the Microsoft Store, etc.
Training users to download binaries from random web sites? That's a security disaster, and it's a Good Thing that web browsers discourage such reckless behavior.
This also affects freeware/shareware and people who want to avoid "store" monopolies and their associated tax. It also affects people who are used to downloading binaries from reputable sites, because Google did not manage to correctly understand what is reputable and what is not (and honestly, trying to automate everything in this area is a recipe for disaster).
One alternative is a package manager like https://scoop.sh/. Its built-in repositories are vetted/curated, free to publish on, and simple to install and use.
There is the downside that it's a CLI-based app, but that's easily overcome with a GUI front-end add-on.
One of my favorite parts of its architecture is that it has a mechanism for adding third-party repositories (buckets), so while the publishing policy for the main bucket is mostly limited to development tools, it wouldn't be hard for the community to build a new bucket for independent software developers to use as a general distribution mechanism.
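As a rough sketch, publishing to a bucket amounts to adding one JSON manifest to the bucket's repo. Every value below is a hypothetical placeholder, not a real package:

```json
{
    "version": "1.0.0",
    "description": "Hypothetical niche tool distributed via a community bucket",
    "homepage": "https://example.com/mytool",
    "license": "GPL-3.0",
    "url": "https://example.com/downloads/mytool-1.0.0.zip",
    "hash": "0000000000000000000000000000000000000000000000000000000000000000",
    "bin": "mytool.exe"
}
```

A user would then run something like `scoop bucket add <bucket-url>` followed by `scoop install mytool`; scoop checks the download against the manifest's `hash` field before installing, which is where the shared-trust aspect comes from.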
So if Google can't understand what is reputable site or not, how do you expect the average user to figure it out? I tell my parents they should never-never-never download random binaries from a web site. The last time my Dad did it, his Windows machine got completely p0wned, and I had to completely reinstall it from scratch.
And freeware/shareware have no tax on most stores, since 30% of zero is still zero.
All of my software is either ISC or GPLv3. Source code links are available on all of my download pages. I've had to remove the pre-compiled binaries for users.
Both Microsoft and Apple ban my specific class of software (emulators) in spite of unanimous court rulings establishing them as legal under fair use laws.
Have you considered releasing the binaries as releases on GitLab or GitHub?
That's at least how I would try to work around the issue.
Regarding the blog post: I think this is a side effect of making the web a "safer" place for all kinds of users; nothing which I appreciate personally but I understand why a large mass of users clicking everything is a concern for Google/Microsoft/Apple.
I once thought that people could be educated to use the web in a better and safer way... now I think everyone (tech-savvy users, big corporations) has given up on that, and "safe by default" is the norm.
I have, some comments here suggest those pages get flagged as well. It does me no good if I move to a GitHub page that gets blocked, and I would also have to deal with some sites choosing to link to GitHub directly, which would bypass the work I put into my site on documentation, tutorials, etc.
Yes, emulators are legal, and you can use them to play homebrew that is FOSS, but the majority of people seem to use them to play non-free ROMs which are often obtained illegally. They are useless to most without these additional non-free binaries. The major software distributors understandably want nothing to do with this. And I'm saying this as a big fan of your work. I consider having working references in software for all those old chips to be important -- but I doubt anyone else has bothered to collect every single retro game cart in existence just to dump them and say they could play them legally on bsnes.
FWIW, if you distribute your app as an installable PWA, the Google guidelines are clear for reducing install friction on Windows/Chrome/Mac/Android, and Let's Encrypt is your cert generator. Perhaps platform vendors treat binaries as malicious by default because they so often lead to privilege escalation. Build an app in the web context and you have a cross-platform sandbox with a standardized permission model. Alas, this is not very helpful advice for folks who haven't grown up developing for the web.
Apart from being a horrible vision of the future, it doesn't make sense trust-wise: The only thing Let's Encrypt verifies is that you control the domain name. It doesn't verify that what you serve on that domain is benign and it doesn't give users any guarantees about who you are.
In terms of apps, the only thing it ensures is that only the entity that published the app in the first place can apply updates to it. For binaries, you can get this level of trust by using any kind of digital signature for your updates.
The author of this article is the creator of the bsnes SNES emulator[1], which is surprisingly resource-intensive even as a native binary. Making it a PWA (via WebAssembly or asm.js or whatever) just isn't practical, at least with the current state of browser tech.
OP develops emulators (and high-fidelity ones, for that matter), these have to be native apps in order to attain the required performance. Webcrap just doesn't cut it, you can go to the archive.org games showroom and watch it peg your cpu to 100% and spin up your fans (while looking terrible and adding huge latency) if you want proof. No offense intended for archive.org, they're great at what they do. But still.
I think these layers of security against native binaries are a good thing, even though it upsets indie software developers (me included).
I used to distribute some small freeware tools for Windows computers for a long time (~10 years). I stopped distributing binaries and now distribute only source (despite the fact that this likely cuts the user base to practically zero), because I decided it was simply impossible for me to guarantee the safety of those binaries.
I also got hassled by these security measures from MS/Google/etc., but honestly, they're right. I was making non-reproducible builds with dependencies I couldn't fully control on an insecure computer, uploading to a web host that I can't really trust, and letting the binaries sit there for months/years.
I used WordPress for a while, and it did get hacked a few times, despite keeping it reasonably up to date. I was first alerted to these hacks by google telling me they found malware on my site and were alerting people to that fact. My first reaction was obviously to be mad at google, but they were right.
Eventually I switched to a static website. But even that is hard to be fully confident in. I'm still trusting a cheap web host to keep their Apache (and whatever else) up to date. I bet cpanel is a cesspool of vulnerabilities given how janky I've observed it to be.
I suspect some or all of the above is true for the majority of the developers negatively affected by these security measures.
If you can actually be fully confident in your whole build and distribution stack, then you can probably easily afford the compliance/certificate costs to meet MS/Google/Apple's requirements to avoid getting flagged by these security measures.