mintflow

This is cool, the token usage aside. Perhaps it could help analyze TCP throughput by piping Wireshark/tcpdump output into it.

fl7305

Do some people still claim "LLMs are just dumb auto completers"?

Because this seems to disprove that claim pretty convincingly?

ShinyLeftPad

How quickly does Claude respond when it acts like a user-space LLM chatbot?

fouc

Think about how much faster it would've been with a small local model!

twoodfin

Modulo Anthropic messing with the model for load mitigation, I wonder how stable this result is.

1,000 pings, how many correctly ponged?
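A minimal harness for that tally might look like this; the `send_ping` stub here is hypothetical and just echoes perfectly, whereas in the real setup it would forward each echo request through the LLM-backed stack:

```python
import os

def send_ping(payload: bytes) -> bytes:
    # Hypothetical stand-in for the LLM-backed stack; a perfect
    # responder simply echoes the payload back.
    return payload

def pong_rate(n: int = 1000) -> float:
    # Send n echo requests with random 56-byte payloads (the classic
    # ping payload size) and count replies that match byte-for-byte.
    correct = sum(send_ping(p) == p for p in (os.urandom(56) for _ in range(n)))
    return correct / n

print(f"{pong_rate():.1%} correctly ponged")
```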

ForHackernews

>Fun? Oh yeah!

I think this author and I have different definitions of fun.

ValdikSS

That's why LLMs will eventually be used only for the initial interaction with the user in their language, to prepare the data for a specialized model.

Imagine face recognition working like a text chat, where the PC gets a frame from the camera and writes in the chat: "Who's that? Here's the RGB888 image in hex: ...".
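The satirical protocol is easy to spell out. A toy 2x2 frame serialized that way (purely to illustrate the joke's encoding):

```python
width, height = 2, 2
frame = bytes(range(width * height * 3))  # toy 2x2 RGB888 frame, 3 bytes per pixel
msg = f"Who's that? Here's the {width}x{height} RGB888 image in hex: {frame.hex()}"
print(msg)
# A real 1920x1080 frame would be ~6.2 MB of pixels, i.e. ~12.4 million
# hex characters per chat message -- which is the point.
```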

bot403

Now do the equivalent of just-in-time compilation: Claude sees that we need to respond to a lot of pings and writes a program to handle them instead of thinking about each one.
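A rough sketch of that idea, with a stub standing in for the expensive model call (every name here is invented, and the "compiled" handler is just a closure):

```python
JIT_THRESHOLD = 3  # packets of one type seen before "compiling" a fast path

def llm_handle(packet: dict) -> dict:
    # Stand-in for a slow per-packet model call that crafts the reply.
    return {"type": "echo-reply", "payload": packet["payload"]}

def compile_fast_path():
    # The handler the model would emit once and then reuse,
    # bypassing further model calls for this packet type.
    return lambda packet: {"type": "echo-reply", "payload": packet["payload"]}

seen_counts: dict = {}
fast_paths: dict = {}

def handle(packet: dict) -> dict:
    kind = packet["type"]
    if kind in fast_paths:  # already "compiled": no model call needed
        return fast_paths[kind](packet)
    seen_counts[kind] = seen_counts.get(kind, 0) + 1
    if seen_counts[kind] >= JIT_THRESHOLD:
        fast_paths[kind] = compile_fast_path()
    return llm_handle(packet)

# After a few echo requests, the fast path takes over.
for i in range(5):
    handle({"type": "echo-request", "payload": bytes([i])})
```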

self_awareness

If you wonder why your Copilot subscription has new limits that you hit every few days, it's because of PhDs like Adam.

Could Adam use a local model hosted on his own box? Probably yes. But he preferred to waste the service we all use just to produce a weak blog post that contributes absolutely no knowledge and serves no purpose other than to tell everyone that the author likes wasting resources and calls it "fun".

> Ridiculous? Yes. Wasteful of tokens? Sure. Fun? Oh yeah!

Do you really think it's fun to be one of these people who are the reason why the rest of us get more limits?

westurner

Wouldn't this be faster with an agent skill that has code?

/skill-creator [or /create-skill] Write an agent skill with code script(s) that use an existing user space IP library that works with your agent runtime, to [...]

ComposioHQ/awesome-claude-skills: https://github.com/ComposioHQ/awesome-claude-skills

anthropics/skills//skill-creator/SKILL.md: https://github.com/anthropics/skills/blob/main/skills/skill-...

/.agents/skills/skill-name/SKILL.md, scripts/{script_name.py,__init__.py}

https://agentskills.io/what-are-skills
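For illustration, a minimal SKILL.md in that layout might look like the following; the name, description, and body are invented, and only the frontmatter shape follows the examples in the anthropics/skills repo:

```markdown
---
name: userspace-ip
description: Respond to IP/ICMP traffic by running the bundled scripts instead of reasoning about each packet.
---

# Userspace IP

For each incoming frame, run scripts/handle_frame.py and return its
output verbatim as the reply frame.
```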

brcmthrowaway

Next up: Claude replacement to handle simdjson processing.

jeremyjh

Perhaps one day, all network services will be provided by LLMs natively. Truly, that would be a day in the future.
