Hacker News | random3's comments

IDK what Dijkstra believed in terms of what programmers should look like, but he did seem to have a sense (and taste) for a direction of programming that was lost among practicing software engineers and their preferred PLs.

My own incomplete opinion is that the net effect is that we ended up writing orders of magnitude more code than necessary to solve the problems at hand. It's the equivalent of doing computations manually instead of using a calculator. This has led to an industry that has served us well, but strictly speaking it was never necessary, and much more could have been achieved with a fraction of the resources.


While there is certainly some amount of unnecessary junk code out there, your claim that it could be reduced by an order of magnitude isn't even close to correct. In general the only way to write less code is to use higher level abstractions. The problem, of course, is that those abstractions are always leaky and using them tends to make certain required features too slow or even impossible to build at all. There is no free lunch.
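To make "leaky" concrete, here's a tiny illustration (my own example, not GP's): floating-point numbers abstract over the reals, but the underlying binary representation leaks through ordinary arithmetic, and you end up needing to know about the layer the abstraction was supposed to hide.

```python
# Floats present themselves as real numbers, but 0.1 and 0.2 have
# no exact binary representation, so the implementation leaks out.
a = 0.1 + 0.2
print(a == 0.3)   # False: the abstraction leaks
print(a)          # 0.30000000000000004

# Acknowledging the leak and picking a representation that fits the
# problem (exact decimal arithmetic) restores the expected behavior.
from decimal import Decimal
b = Decimal("0.1") + Decimal("0.2")
print(b == Decimal("0.3"))  # True
```

The point isn't that floats are bad; it's that every abstraction makes a trade-off, and the trade-off eventually surfaces to whoever builds on top of it.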

As programmers we like to use all this jargon like "leaky abstraction", but we never bothered to understand it beyond the PL paradigms we use. There's no formal definition, which simply makes these good terms to abuse and throw into conversations to make our points.

Why are the abstractions leaky? Are all abstractions leaky? We never ask why; we simply accept the situation without spending any real effort.

"There's no free lunch" - this is representative of the level of argument in software circles entirely. But WTF does that mean? If the lunch is not free, how cheap or expensive can it get and why?

This is why, as engineers, we tend to brush off the Dijkstras as arrogant, while at the same time ignoring both our arrogance and ignorance.


A leaky abstraction is like obscenity: I know it when I see it. It's impossible to define the concept in a rigorous way, and yet it impacts everything that we do.

You're simply wrong to claim that we accept the situation without spending any real effort. In reality the more experienced developers who build abstraction layers tend to spend a lot of time trying to prevent leaks, but they can't have perfect foresight to predict what capabilities others will need. Software abstractions often last through multiple major generations of hardware technology with wildly different capabilities: you can't prevent those changes from leaking through to higher levels and it would be foolhardy to even try.


I understand your position and I think it's the norm. Yet I find it difficult to comprehend how it's not self-evidently absurd.

Do you feel like software transcends physics, mathematics, and logic? Because that's what the statement translates to.

The only reason it's impossible is that nobody tries, because trying to do so would interfere with the deliverables of the next sprint. The software industry has painted itself into a corner.


Physics is full of leaky abstractions. Solid? Leaky abstraction (melting). Ideal gas? Leaky abstraction (van der Waals). Molecule? Leaky abstraction (chemical reactions). Atom? Leaky abstraction (ionization, fusion, fission, alpha and beta decay). Proton? Leaky abstraction (sometimes you have to care about the quarks).

Check out Urs Schreiber if you want to get over it.

"Software people are not alone in facing complexity. Physics deals with terribly complex objects even at the "fundamental" particle level. The physicist labors on, however, in a firm faith that there are unifying principles to be found, whether in quarks or in unified field theories. Einstein repeatedly argued that there must be simplified explanations of nature, because God is not capricious or arbitrary.

No such faith comforts the software engineer. Much of the complexity he must master is arbitrary complexity, forced without rhyme or reason by the many human institutions and systems to which his interfaces must conform. These differ from interface to interface, and from time to time, not because of necessity but only because they were designed by different people, rather than by God."

- Fred Brooks, No Silver Bullet


No, I feel like software developers are unable to predict the future. Mathematics and logic aren't much help with that and physics barely enters into it.

>"There's no free lunch" - this is representative of the level of argument in software circles entirely. But WTF does that mean?

You cannot have a thing without doing the work to build it. You don't get the better abstraction without implementing it first. Your proof in theory is just that, until it is exercised and the divergence from the ideal to the real world is finally realized. I can teach a programmer all manner of linguistic trickery to let them exploit all sorts of mathematical abuse of notation. None of that makes a cotton-picking, salt-licking bit of difference if, at the end of the day, your symbolic proof isn't translatable to machine code that runs and maps successfully onto the operational space of an implementation of a computing device. If you give me a program written in the form of a Shakespearean sonnet (an example of a focus on radical novelty in encoding a program without regard to analogy), say, I still need a bloody compiler that'll turn that into something capable of running within the constraints of the machine, along with the other primitives to make it work. That's TANSTAAFL. If you break from what exists, you still have to reroot and establish a parallel basis of operation that covers the primitive operations you're familiar with.

Dijkstra might be right: there's something liberating about staying in the realm of the formal and mathematical. His detractors were also right: he was so damn far above everyone else that everybody in the room had trouble understanding just what it was he was going on about. At the end of the day, teach what the greatest number of people can firmly mentally grip, and pass that on. The geniuses like Dijkstra will quickly outgrow it and excel; they don't need the help. Everyone else, on the other hand, does.

I wouldn't be opposed to trying Dijkstra's approach myself: shattering my current understanding of the practice of programming and working more from a formal-methods POV. That comes after a career which has been fruitful and was rooted in the old way, which worked quite well for many others educated at the same time I was. I already know I can do it; his method just changes the emphasis.

Though I will note with alarm that his reticence to test is disturbing. If he assumes everything is provable from the get-go, then I suppose you don't need tests; but that's hardly the way anything in the world actually bloody works. That's math in a vacuum, with spherical cows - not writing code and then realizing "Shit, the processor in the machine I'm writing for doesn't support that primitive, or has a glitchy implementation thereof."

Software engineering isn't programming for people who can't; it's a set of practices and know-how for navigating a niche field, battle-hardened and tested through time to actually guarantee some semblance of a chance of success in a field shaped by development so fast that the logic of six months ago seems antiquated. For that time, with Moore's Law in full swing, yeah, radical novelty might have been justifiable; but it ultimately didn't pass the test of time. It can be as clever a hack as you can imagine, but if no one else can follow it... you haven't condensed it to a teachable form.



> IDK what Dijkstra believed in terms of how programmers should have looked like,

https://news.ycombinator.com/item?id=47373080


I think the "and" is for "stopped (both) A and B"

Yes, that's what they're pointing out. Changing multiple variables at once means you can't attribute changes to any one of those variables in particular.

I knew there must be some good news today

Sorry, it's methanol that will make you blind or directly kill you. (Methanol is smaller than ethanol, so it's easier to produce in non-biological chemical reactions.) Try again tomorrow.

They do have nice pictures

I call these "romantic definitions" or "gesticulations". For private use (personal or even internal to teams) they can be great placeholders, assuming the goal is to refine vocabulary.

These charters are as useful as new year resolutions.

Fun times. Coolers, paste, fans, power supply wattage, DIP switches and jumpers. Quake, 3dfx Voodoo vs. NVIDIA GeForce. This is where it all started, kids.

I was in high school and had been running a "computer games club" (~ an Internet cafe for games and kids) since 1998, when we got a place, renovated it ourselves, and got custom-built furniture (cheap narrow desks) and initially 6 computers - AMDs at 300MHz. By 2000 we had broken through a wall into the adjacent space and had ~15 machines, cable + satellite internet for downloads, and whatever video cards we could buy or scrap. It was wild.


Finding high school kids with a similar "tech" background today seems really hard. Tech users, sure - chronic phone / game addicts are everywhere - but that tweaker spirit is rare.

> This is where it all started, kids.

Nah.. Cassettes, computers-in-a-keyboard, booting straight into BASIC.. THIS is where it all started, grandkids.


Hahaha! True. I was pointing at the beginnings of modern GPUs and NVIDIA, but yes, I heard the cassette screeching before the modem screeching indeed.

I think it depends on what you're building. I find it fun, once in a while, as an engineer, to "not go shoeless" and get some of the things I need done.

Good luck! Fintechs targeting SMBs is a go-to-market strategy template that makes sense until you actually go to market and realize that if you have a better product, there's a better, bigger market - and that market is the mid-market...

The thing with startups, as with SMBs, is that most of the time they are fragile, not financially sound institutions. At least with startups, those that don't die usually grow and need the larger-scale features anyway.


Thanks! In general we optimize for simple UX and would rather connect to your banking app than replace it. That does help keep feature demand down. But our goal is to grow along with our customers, communicate closely with them, and add the features they need as they scale.

You can probably go the other way and target consumers no? Or are they not equidistant?

Consumers are usually a whole different beast. Everything is different with consumers from sales and marketing to regulation, particularly in the financial sector. I don't see how it could be a natural move, especially not with a treasury product.

There are exceptions, though, like Mercury, which expanded to consumers after having success with their business banking product.


Most banks target both consumers and businesses right? I have some ideas as to why, but it's an observable fact.

Definitely relatable across many markets

Interesting.

Are you generating revenue or, otherwise, what productivity are you measuring?

Without generating revenue (which, to be clear, is a very good proxy for measuring impact), everyone can indeed be very prolific in their hobbies. But the labor market is about making money for a living, and unless your work directly covers your day-to-day needs, it can't be called productive.


Very valid point. I will lay down the facts for you:

At my previous employer, I was generating $2.5 million per year (revenue per employee). I didn't ship a single line of code; all the time was spent trying to convince various stakeholders.

Now, I have already built a couple of apps that help me better manage my tech news (keeps me sane), plus I'm writing a blog that generates $0. It's only been a month.

If you measure the immediate dollar value, you are right. But in life, pay-offs are not always realized immediately. Just my opinion anyway.

