Lightbug HTTP: Mojo web framework
https://github.com/saviorand/lightbug_http
by @a2svior
hi, I have been experimenting with Lightbug as a serving interface for https://github.com/alainrollejr/mocodes
A couple of questions which I'm not sure are worth filing change requests for. The first one is that I actually needed HTTPService's func to be allowed to change self (essentially to keep state over incoming requests). For now I solved that through pointers, but it would be easier if I could just change some self.counter value from func.
Second one: my test client sends POST messages of a certain length L (smaller than 4096 bytes). Is there a way for the serving handler only to be called once the full L bytes have been received on the socket? Mostly that is the case, but random occurrences happen whereby func() is called on a body with an unexpected length way smaller than L (requiring me to keep even more local state).
Last, my application actually does numerical work on Int8 bytes. So I think it would be beneficial for my application if Bytes could actually be a DTypePointer to Int8 rather than a List of Int8, even though List has this scary "steal_data" functionality.
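To make point (3) concrete, here is a rough illustration of why a DType-backed buffer is convenient for numerical work, using the Mojo ~24.3-era DTypePointer API. This is a standalone sketch, not Lightbug code, and the API has since changed in newer Mojo versions.
```mojo
# Standalone sketch (Mojo ~24.3-era API assumed): numerical work on Int8 data
# is easier against a DTypePointer[DType.int8], which gives SIMD loads/stores,
# than against a List[Int8]. Not Lightbug code.
from memory import DTypePointer

fn main():
    var n = 64
    var buf = DTypePointer[DType.int8].alloc(n)
    for i in range(n):
        buf.store(i, Int8(i))
    # SIMD-friendly access: load 16 Int8 values in one go.
    var chunk = buf.load[width=16](0)
    print(chunk)
    buf.free()
```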
@a2svior fyi ^
@Jack Clayton Thanks a lot for tagging!
@alain great questions, I think these are definitely worth making issues for. I'll create some issues today and link here. Then let's continue the discussion on GitHub if that works
@a2svior you may both be interested in the Bytes implementation in Basalt, in the utils folder
Ah perfect, thanks a lot for the rec!!
@alain made these three issues, tried to rephrase them based on my understanding. Let me know if I made mistakes somewhere
1. HTTPService's Self should be mutable to maintain state over incoming requests
2. Request handler is called on a body with unexpected length
3. Bytes should be a DTypePointer instead of a List
@alain how urgent/critical are these? Are you still able to achieve your goals with Lightbug without them? Thinking which ones I should prioritize
That's awesome @a2svior. The descriptions capture the ideas well. (2) is the most critical and urgent one, as it actually stops me from being able to use Mocodes in my application, and I don't have enough HTTP/sockets knowledge myself to go fix it. Having said that, I took a stab at (1) myself by simply changing the HTTPService trait's func(self, ...) to func(inout self, ...), but then ran into rvalue and mutability issues in server.mojo which I did not know how to circumvent.
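For reference, a minimal sketch of what (1) would look like once the trait takes `inout self`, which is exactly the edit described above. The Counter service, its field, and the OK(...) body are illustrative assumptions, not Lightbug code.
```mojo
# Sketch only: assumes the HTTPService trait is changed so that func takes
# `inout self`. The Counter service and its response body are made up here
# for illustration; OK()'s accepted argument types are an assumption.
from lightbug_http import *

@value
struct Counter(HTTPService):
    var count: Int

    fn func(inout self, req: HTTPRequest) raises -> HTTPResponse:
        # Mutating a field is what currently fails with a plain `self`.
        self.count += 1
        return OK("requests served: " + str(self.count))
```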
Got it. Can you maybe attach a code snippet inside the issue #2 with the service code/request payload you're using so I can try and reproduce the problem on my end?
#1 should be fixable, I'll look into it
Done !
Thank you, will see if I can fix this today. I'll keep you posted
@alain couldn't reproduce yet, asked a couple questions in the issue. If you can let me know the OS you're using, that would also help
@alain (2) should be good now as long as a Content-Length header is set (which is the case in your code). Will try (1) next
hi @a2svior I confirm (2) is also fixed on my system with those fixes you made. Many thanks!!
Hello @a2svior, me again. So I have finally found some time to do speed benchmarking with the simplest possible Lightbug server that mimics the behaviour I ultimately need from my Lightbug version of the decoder. This simple Lightbug server just sends back the received packet wrapped in an OK() message. I have a test client that sends packets in a loop to this server and measures the elapsed time. What I found is that as I increase the size of the packet, with a Lightbug server the packet rate decreases rapidly, whereas with a Flask or FastAPI Python server that does the same job, the packet rate decreases much less. Again, I don't want to just go ahead and create a GitHub issue for this as I realise that large binary HTTP packets (up to several hundred kilobytes) were probably not what you had in mind for Lightbug. Please advise.
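For readers following along, this is roughly the echo service being described, as I understand it. The `body_raw` field name and the OK(...) overload taking bytes plus a content type are assumptions about the Lightbug API of that time, not alain's actual code.
```mojo
# Roughly the benchmark service described above: echo the received packet
# back wrapped in a 200 OK. `body_raw` and the OK(bytes, content_type)
# overload are assumptions, not verified Lightbug API.
from lightbug_http import *

@value
struct EchoService(HTTPService):
    fn func(self, req: HTTPRequest) raises -> HTTPResponse:
        return OK(req.body_raw, "application/octet-stream")
```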
@alain in general, performance definitely needs improvement. I suspect that in your particular case this has to do with the fact that there is currently conversion to strings and back going on in various places. I'm actually removing these conversions by default, which should also make it easier to serve binary files like images, see this PR https://github.com/saviorand/lightbug_http/pull/43 . I haven't benchmarked it yet though, maybe we can try it out with your test code?
In this (general) performance improvement issue someone also posted nice flame graphs from perf for Lightbug, those were running an older Lightbug version though: https://github.com/saviorand/lightbug_http/issues/6
Definitely interested to try and improve performance for your case. I'm a newbie in the performance world though, would appreciate any tips/suggestions.
Another issue is the practical absence of async in Mojo (since there's no runtime for async functions); this is also a drag on performance.
But in your case I think that as long as we remove redundant conversions and re-assignments of variables and boil it down to a minimal Mojo layer plus external C calls to socket APIs, we should be able to get close to Flask (or better) in terms of performance
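To make "minimal Mojo layer plus external C calls" concrete, the general shape of such a call looks like the sketch below. The import path and exact signatures vary between Mojo versions, so treat it as illustrative only.
```mojo
# Sketch of calling a C socket API directly from Mojo via external_call.
# Import path and signatures follow the ~24.x pattern and may differ in
# other Mojo versions; not taken from Lightbug's sources.
from sys import external_call

fn send_response(fd: Int32, response: String) -> Int32:
    # write(2): push the raw response bytes to the client socket without
    # any intermediate string conversions. Error handling omitted.
    return external_call["write", Int32](fd, response.unsafe_ptr(), len(response))
```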
Sounds good. I am happy to post my test client code, e.g. in a GitHub issue that exists or one that you create for it, let me know
Created an issue here, can you share the code there so I can try it on my end? Thanks a lot, really appreciate this info
Significant slowdown with large payloads · Issue #45 · saviorand/lightbug_http
@alain I also saw in your repo that gRPC support is something that could be interesting? I might create an issue for that as well
I would LOVE gRPC support over plain HTTP any day. If that is within your roadmap ideas, that would be fantastic.
The way I see it, the biggest impediment to Mojo adoption is that it is not taking care of IO. It's wonderful if you can demonstrate a fast algorithm on a file or on data in RAM, but how do we get that into a microservices-oriented ecosystem? Because no, we don't all want to go rewrite all of the existing C, C++, etc. code in pure Mojo. Real programs act on real data coming from the external world and send the result back to the real world, preferably at the same whopping speed of the algorithm itself.
Yes, I'm definitely interested in gRPC support as well. Haven't thought through the logistics yet. Maybe a separate library makes more sense, like lightbug_grpc, to keep things separate.
@a2svior Instead of directly building it as a gRPC-specific library, it would be better to structure it as a general RPC (Remote Procedure Call) library. The gRPC implementation can then be built on top of this general RPC framework. Others, like Cap'n Proto, could also be built on top of the general RPC layer.
@a2svior I updated the ticket with test client and server code and the corresponding results obtained on my machine. Happy hunting!
@alain thanks! Will give it a go and keep you posted if I can improve something
@NobinKhan good point, let me know if you wanna collaborate on this, we can coordinate over DMs
Thank you so much. I am interested. It will be a great opportunity for me to learn new things and gain some skills.
hi @a2svior, I was wondering whether you have made any progress on the speedup exercise for large message sizes? Are you close to a solution? If not, I may have to fork and hack/slash myself in order to meet my project deadline
We managed to achieve better performance on this PR https://github.com/saviorand/lightbug_http/pull/50, @toasty helped a lot with that. But now I'm getting some issues with running your test; I think I need to refactor the request processing logic. If you can try pulling this and have any ideas on how to debug the errors we're getting, that would really help!
The only thing is that today's nightly of Mojo broke some things again, so if you can, maybe use yesterday's nightly version
Let me know if you encounter any issues running this
It's still slow on the largest payload in your test, but I have an idea on how to fix that. Will try today
Hi @a2svior, I myself stay on stable builds rather than nightly to avoid unexpected time lost on any given day. E.g. right now I am on v24.3.0. Are you saying your fixes/improvements won't be compatible with that Mojo version?
Tested this with 24.4, should work. But I'm still debugging to make your test work; right now it breaks after some point. If you have any idea on how to fix it, let me know https://github.com/saviorand/lightbug_http/pull/50
Okay @a2svior, as soon as I have upgraded to 24.4 I will give it a go!
I've just given it a try but it seems to crash real quick with my test client indeed. One thing I do notice is that while I am sending "Content-Type: application/octet-stream", the Lightbug server reports "Content-Type: text/plain", and also while I send a 1000-byte packet body the Lightbug server reports "Content-Length: 998". Maybe this is stuff I should comment on in the pull request ticket
Yup, thanks. Let's continue there!
Lightbug has reached 420 stars on GitHub 🔥
Hello,
SSE (server-sent events) can be an easy-to-use alternative to WebSockets until those get implemented with async/await:
https://www.pubnub.com/guides/server-sent-events/
It basically upgrades an HTTP GET request into a one-way stream.
The server first sets the Content-Type header to "text/event-stream", then the socket is stored in an array.
Whenever the server needs to send data, it just writes to the socket.
On the browser side, a simple JS callback is set for whenever an event is received.
This is particularly useful for 2D games in the browser, but also for your AI UIs!
The advantage is that it has way less complexity than WebSockets.
If there are 10 visitors, it is as easy as looping over an array and sending data (usually JSON).
Because the socket stays open, it can be used as a session too.
(ping to @a2svior)
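A minimal sketch of the SSE wire format described above, in plain Mojo string-building; these helpers are illustrative only and not part of Lightbug.
```mojo
# Minimal sketch of the SSE wire format described above. Illustrative helpers,
# not Lightbug API.
fn sse_response_head() -> String:
    # Sent once when the GET request is "upgraded" to an event stream.
    var head = String("HTTP/1.1 200 OK\r\n")
    head += "Content-Type: text/event-stream\r\n"
    head += "Cache-Control: no-cache\r\n"
    head += "Connection: keep-alive\r\n\r\n"
    return head

fn sse_event(payload: String) -> String:
    # Each event is just "data: <payload>" followed by a blank line,
    # written to the socket that was kept open.
    return "data: " + payload + "\n\n"

fn main():
    print(sse_response_head() + sse_event("hello"))
```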
This is great, thanks a lot! Actually this might be a good idea to implement first before Websockets
There is a great test suite to test websocket implementations:
https://github.com/crossbario/autobahn-testsuite/
The list of projects and users that used it is impressive
(ping to @a2svior)
Looks cool, will check it out. By the way, if you're interested in contributing to a SSE/websocket implementation let me know, happy to discuss!
Working on it 🔥! Already have the connection upgrade to websocket and the ability to receive messages of all sizes!
It is quite difficult to implement all the features (fragments) and many things could raise errors.
Let's focus on an example that can do receive and send.
We might need to get on an audio conference and adapt it to lightbug
Documentation:
https://developer.mozilla.org/en-US/docs/Web/API/WebSockets_API/Writing_WebSocket_servers#the_websocket_handshake
Would you mind if I PR a websocket.mojo to lightbug in a work_in_progress folder? (That way you can scavenge the example and integrate it into lightbug.)
Definitely!! Also happy to get on an audio conference. Maybe let's do the PR first, I'll try my best to integrate, and then we can coordinate/talk to resolve any remaining questions
Lightbug 0.1.3 Release just dropped!
Featuring performance and stability improvements and a new installation workflow -- you can now add Lightbug as a dependency to your mojoproject.toml and import Lightbug with Magic to use in your projects.
Check out the README for details on how to try it out:
https://github.com/saviorand/lightbug_http
Lightbug 0.1.4 Release just dropped!
Headers are much more ergonomic in Lightbug 0.1.4 thanks to @bgreni 's contribution!
There are now three options for specifying the headers that are accepted as input to HTTPRequest or HTTPResponse:
1. Assigning to headers directly
2. Passing one or more instances of the Header struct to Headers
3. Using the parse_raw method on Headers
The headers can then be accessed by name, e.g. headers["Content-Type"] returns "text/html".
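A rough sketch of options 1 and 2 plus access by name is below. The Header/Headers names and the indexing syntax come from the release notes above; the no-argument constructor, item assignment, and variadic constructor used here are assumptions, so check the README for the authoritative examples.
```mojo
# Sketch of the 0.1.4 header options. Header/Headers and indexing are from the
# release notes; the constructor signatures used here are assumptions.
from lightbug_http import *

fn build_headers() raises:
    # 1. Assigning to headers directly:
    var headers = Headers()
    headers["Content-Type"] = "text/html"

    # 2. Passing one or more Header instances to Headers:
    var headers2 = Headers(Header("Content-Type", "text/html"), Header("Connection", "keep-alive"))

    # 3. parse_raw parses a raw header block coming off the wire; its exact
    #    signature is shown in the release notes, so it is omitted here.

    # Headers can then be read back by name:
    print(headers["Content-Type"])  # "text/html"
    _ = headers2
```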
The codebase is also much more Pythonic now with refactors from @bgreni, with more use of dunder methods and direct string operations.
Do you have some benchmarks?
@Peter Homola yup, some of the latest ones were posted by @bgreni here https://github.com/saviorand/lightbug_http/pull/61#issuecomment-2362104634
FYI I was running it on an M3 chip
Do you plan on having some sort of templates? I wrote a PoC, which can be seen here: https://arax.ee/mojo. I'd be interested in having something similar to Go.
Do you mean for HTML specifically or something general-purpose like https://pkg.go.dev/text/template?
For HTML I have some future plans, but this will be in a separate library called lightbug_web that will build on lightbug_http
HTML, given the context.
I'm starting to implement a mojo-websockets package. Totally WIP, but my intention is to roughly conform to the python-websockets one, first starting with the sync version and later the async one
@msaelices we also have an open PR with @rd4com to add websockets to lightbug, in case you'd like to take a look: https://github.com/saviorand/lightbug_http/pull/57
wow! I did not know that PR! Will take a look. Thanks!
Lightbug 0.1.5 Release just dropped!
The most important contribution is by @bgreni - the HTTP client now follows redirects.
We've also reorganized the code quite a bit and removed the client and server implementations that were calling into Python for socket interactions.
It's all Mojo now :mojo:
https://github.com/saviorand/lightbug_http/releases/tag/v0.1.5
Great job with the speed improvements! @a2svior @bgreni