Hacker News: greatgib's comments

The basis of httpx is not very good at all.

I think it owes its success to being the first "port" of Python requests to support async, which was a strong need.

But otherwise it is bad: the API is not that great, performance is not that great, tweaking is not that great, and the maintainer mindset is not that great either. On that last point, a few examples were referenced in the article, but it can easily cause your production project to suddenly break in a bad way without valid reason.

Though it isn't perfect either, I would advise everyone to switch to aiohttp.


Just the other week I literally had the choice between using requests and httpx. I chose httpx after deliberating a bit. I don't need async capabilities right now, but I figured it'll be more consistent if that changes later.

I started using the ports and adapters pattern, with a Protocol, for any packages that have replacements or concerns.

Basically treating HTTP requests as an orthogonal, cross-cutting concern.
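A minimal sketch of what that can look like in Python with `typing.Protocol` (the names `HttpPort`, `FakeAdapter`, and `fetch_user` are mine, not from the comment; the commented-out httpx adapter is an assumption about how one might wire it):

```python
from typing import Protocol


class HttpPort(Protocol):
    """Port: the only HTTP surface the rest of the app sees."""

    def get_json(self, url: str) -> dict: ...


# A real adapter would wrap the concrete library, e.g. (hypothetical):
# class HttpxAdapter:
#     def __init__(self) -> None:
#         self._client = httpx.Client()
#     def get_json(self, url: str) -> dict:
#         return self._client.get(url).json()


class FakeAdapter:
    """Test double: satisfies the same port with no network I/O."""

    def __init__(self, canned: dict) -> None:
        self.canned = canned

    def get_json(self, url: str) -> dict:
        return self.canned


def fetch_user(client: HttpPort, user_id: int) -> str:
    # Application code depends only on the port, never on httpx itself,
    # so swapping httpx for requests or aiohttp touches one adapter.
    data = client.get_json(f"https://api.example.com/users/{user_id}")
    return data["name"]


print(fetch_user(FakeAdapter({"name": "Ada"}), 1))  # Ada
```

The point being that if the upstream package turns out to be abandoned or breaks you, only the adapter changes.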

It is sometimes hard to tell if these upstream packages are stable or abandoned.

I should probably document my methodology so it can help others, or at least so they have a chance to find out what mistakes or limitations it might have.


aiohttp is an excellent library, very stable. I concur, but! It's too heavily tied to HTTP/1, and well, I am not a fan of opening thousands of TCP connections just to keep up with HTTP/2 onward. niquests easily beats aiohttp using just 10 connections and crushes httpx; see https://gist.github.com/Ousret/9e99b07e66eec48ccea5811775ec1...

fwiw, HTTP/2 is twelve years old, just saying.


aiohttp is for asynchronous contexts only

I don't think their organisation even knows how to do things well. It's not in their DNA to not fuck up their users.

But that being said, I had a good laugh at their announcement, because you know they will spend money to try to make the thing nice, doing everything they can at their own cost, to win the users back and lock them in, and then they will start to fuck them up again once they feel confident enough.


Doesn't make sense, because AI responses are not grounded. For AI to make sense in this context, and to have any relation to Kagi's purpose, you still need the search, and then the AI processes the search results.

Common facts like “what is the capital of Hungary” are repeated so many times in the training data that the LLM knows them without a search.

I agree with you for a general assistant, and even though I'm also not interested in paying for an assistant, there are 2 features that I like and that bring a lot of value by default in my opinion:

- if you put a question mark at the end of the query, you get an AI reply based on the search results.

- you can then ask follow-up questions about it to investigate more.

Tesla was not initially created by Musk: https://www.greencarreports.com/news/1131215_tesla-existed-b...

So, the initial good direction may have been despite him, with the success still coming mostly thanks to the big load of money he brought in.


Such a joke to advertise Claude as a tool to work on corporate technical debt when it is definitely the thing that will increase it a lot.

And let's not even discuss the vacuity of their new cash machine certifications. "Architect" come on...


We're 6 months away from some company's app/infrastructure/whatever going down and staying down, because literally nobody knows how the 500,000 line code base works and Claude is stuck in a loop.

Lol, just press escape then tell it to roll back to the last stable release.

Right, because the people who vibe coded their applications are the same people who take the time to set up robust infrastructure that allows for easy rollbacks.

LLMs are good for documenting specific things.

E.g., "find where the method X is called and what arguments are passed".

That can be useful for refactoring or debugging.
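That kind of question also happens to be mechanically checkable, which is how you'd verify the LLM's answer. A stdlib-only sketch using `ast` (the `SOURCE` snippet and `find_calls` helper are mine, for illustration):

```python
import ast

SOURCE = """
def handler(evt):
    process(evt, retries=3)

def other():
    process("x")
"""


def find_calls(source: str, name: str):
    """Report every call site of `name`: (line, positional args, keyword args)."""
    hits = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Call):
            func = node.func
            # Handles both bare names (process(...)) and attributes (obj.process(...)).
            callee = getattr(func, "id", getattr(func, "attr", None))
            if callee == name:
                args = [ast.unparse(a) for a in node.args]
                kwargs = {kw.arg: ast.unparse(kw.value) for kw in node.keywords}
                hits.append((node.lineno, args, kwargs))
    return hits


print(find_calls(SOURCE, "process"))
```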

Coding is the worst way to use an LLM though.


Shhh...you're only supposed to unilaterally praise it to get along with your clueless leadership.

The same is true for every other strategy to avoid technical debt.

It is bullshit all the way down.


They should at least have restricted it to IPv6. Here it will kill it for everyone using mobile networks and 5G hotspots.

Can you receive inbound connections on your hotspot?

At least in France, and I think in a lot of other countries, you still get a dedicated IP for your connection, so yes, you could receive inbound traffic.

It's just that the IP will most of the time be dynamic, so you might have your IP changing regularly.


If we need to sum up the state of the government now, I would pick the following quote:

   He told another colleague, who refused to help him upload the data because of legal concerns, that he expected to receive a presidential pardon if his actions were deemed to be illegal, according to the complaint.


   The votes made up less than 4 percent of those cast in Basel-Stadt and would not have changed any results
I like the concept of "your vote was useless anyway".


I also find it far more complicated and messy than what you can find in other languages, without proper justification.

Like Temporal.Instant, where the only difference is that it is "now" but in nanoseconds. It would have been better to have Now with a suffix to indicate that it is more precise. Or even better, to let the function you use, or a parameter, determine the precision.

And why Now as the namespace? I would expect the opposite: like Python, you would have something like Temporal.Date, and from there you get a date of now or of a specific time, with or without timezone info, ...

