Basically he wanted home automation in Perl to control his geothermal/solar house, and ended up reimplementing Perl with AI. That's some yak shaving...
I'm interested, but can't navigate the website. The down-arrow in the lower-right is unclickable, maybe covered by some semi-transparent chrome of my browser, not sure. And no idea why there need to be 4 directional arrows.
That's the Reveal.js / Slides.com format. It became very popular in the 2010s. The idea behind the 2-D navigation is that you use left/right to move between chapters and down to dive into a specific chapter. This lets you skip chapters under time constraints, or hide gnarly details so that those specific slides don't break the flow of the presentation while still being available to the audience online. Or you can have slides announcing demos, and if a demo doesn't work, the down slide has a video showing how the demo is supposed to work. Many possibilities like this. Also, the slides are produced from Markdown, so the format appealed to many authors.
However, doing chapters well turned out to be tricky: ideally you want 3 to 7 of them per talk, all of similar size, but many presentations aren't structured like that. The rise of Slideshare and SpeakerDeck for sharing slides in the mid-2010s pushed this 2-D navigation out of favor: those services only support linear, static slides. This is also a reason why people use fewer animations in slides nowadays, and why tools like Prezi didn't catch on (another presentation tool with non-standard navigation that fell out of favor very quickly).
Many people still use Reveal.js to make their slides but they stick to left-to-right nav only.
Deep-linking into a reveal slideshow from the presentation - which is meant to be navigated by keyboard arrows ONLY - is suboptimal.
Yes, standing in such a hole without shoring is normally not recommended - safety first - but you don't know the specifics of this soil. It's like concrete: the excavator digging that hole stood on its very edge after 2 days of rain, no problem at all. The disadvantage of such soil is that the percolation rate approaches zero.
Yes, doing a "Perl interpreter" (it's way more than that) even with the help of the most advanced AI on the planet is a PITA. The coding agents fake things, lie, or are way out of their depth, but when you have been used to limited AI since the '90s - like me - you know how to handle it. The good news is that in 2 years you will probably tell your AI to create - just for the laughs - a Perl interpreter AND a smarthome system before you go to bed, and have them ready for breakfast.
As for the maturity of the project, it's really too soon. I thought the German Perl Workshop would be in May, but mixed that up with last year's date, so I presented what I had. In about two months this should be nice(r).
And one final remark: everyone knows Torvalds for the Linux kernel, but most don't know - or ignore - that he did Git too. Here, I presented two things: WHIP and pperl.
WHIP is a smarthome solution way above and beyond what is available on the market today, but that somehow seems to evade people's minds when they see the slides.
It's a very interesting project (even if I always avoided Perl and 'officially don't care'). So it sucks that you got a mediocre response because of dumb slideshow UI issues. Maybe write up a blog post and try again later (just make sure it's not too ChatGPT-ish).
Ugh, deep links should be part of the path, and the anchor should say where on the page to scroll. Very annoying slide software. If the content weren't so good I simply wouldn't bother.
HTML+JavaScript-based, statically hostable apps (e.g. presentations) can't use paths as deep links, since there's no standard for URL rewriting on simple static hosting (even 30 years later). Oh well.
You should be able to use the query part of the URL (after the ?). You can get at it with JavaScript, but it doesn't influence which static HTML page is served.
This looks like a huge project, even with AI help... I have a soft spot for Perl, but I'm honestly not sure the current community has the bandwidth and interest to sustain an alternative implementation. At the very least it should be ported to macOS too. Breaking with XS is a bold decision. Best of luck though!!
macOS is the easy part.
XS is the problem: once you break that bridge, a lot of serious CPAN distros turn into deadweight, with somebody stuck redoing piles of dependencies plus the weird hooks old tooling expects.
If anything kills this project it won't be platform support.
It'll be Perl's regex engine, the ancient edge cases around it, and the fact that AI can spit out code that compiles while still missing half the assumptions buried in modules people still need.
It's kind of a crappy slide deck, not a proper home page. Even worse, the link drops you into the middle of the deck. (To be fair, it wouldn't be so bad if you knew it was a slide deck when you loaded the page.)
Try using the arrow keys to navigate. It took me multiple tries to get it figured out.
Use up/down to navigate within a chapter/topic.
Use left/right to switch between topics.
The project relies on Rayon [1] for scheduling parallel tasks and Cranelift [2] to JIT the hot loops.
There are plenty of other interesting features like auto-FFI, bytecode caching (similar to Python's .pyc files), and "daemonize" mode (similar to mod_perl or FastCGI).
I wonder how long he waited for the CPAN nologin case. I remember requesting a CPAN account 3 years back and it took ~2 months for someone to look at and accept.
Very good, actually. But you have to nudge them slightly: tell them you prefer the modern version of the language, with gradual typing† and function signatures, and you'll get very good results. The Perl interpreter comes standard on modern OSes, and thanks to permissive licensing and impeccable backwards compatibility you can always assume you're dealing with a very modern version of Perl.
I write Perl scripts that are 10-100 lines of code, and at this size Perl is a Strictly Better Bash: better syntax, some type checking, better text support, and still effortless calls to external processes: essentially you put a command with arguments in backticks, and you get its output. Ruby can do it too, but not all systems have it. Python is another obvious choice, but calling external commands in it is annoying. I also use Perl for some one-liners as a better `sed` for text replacements.
† Perl nowadays has TypeScript-style type checking for function parameters. So, while the syntax is sometimes wild, the language is much better than it used to be.
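The backticks workflow described above can be sketched like this (`echo` stands in for whatever external command you actually need):

```perl
use strict;
use warnings;

# Run an external command via backticks and capture its stdout.
my $out = `echo hello world`;
die "command failed with status $?" if $? != 0;
chomp $out;
print "got: $out\n";    # got: hello world

# The better-sed one-liner style mentioned above, as a shell command:
#   perl -pe 's/foo/bar/g' input.txt > output.txt
```

Backticks go through the shell, so `$?` carries the child's exit status the same way it does in Bash.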
Assuming every OS ships new Perl is a good way to lose a bet, since RHEL and CentOS are happy to hand you a system package from years ago.
All the gradual typing and signatures in the world do not matter when the interpreter on the target box is old enough to miss half of it, and then you are dragging in CPAN modules or juggling shebangs just to get the same script to run everywhere. Bash at least advertises its limits. Perl can look like a nicer shell tool right up until deployment turns into a version scavenger hunt.
What kind of context has you deploying onto old systems that don't ship a recent Perl? If that's a legacy requirement for whatever reason, then I'd at least use Docker or Podman to get a recent runtime. Or would you also write Python 2 or PHP 7?
What are you using for parameter type checking? I switched to native function signatures and native try/catch, and might look into the new class system soon, but I don't recall native type checking...
Perl 5 has this. There are modules that get you function signatures and type constraints. It's all opt-in and, as was said, you have to nudge LLMs to use it, but they can, and the results are indeed better.
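A minimal sketch of what that opt-in style looks like: native signatures (stable since Perl 5.36, available behind a feature flag since 5.20) plus a hand-rolled numeric constraint standing in for what CPAN modules such as Type::Tiny provide:

```perl
use strict;
use warnings;
use feature 'signatures';
no warnings 'experimental::signatures';
use Scalar::Util 'looks_like_number';

# Hand-rolled constraint; type-constraint modules on CPAN give you
# richer, reusable versions of this check.
sub assert_num ($v) {
    die "expected a number, got '$v'\n" unless looks_like_number($v);
}

sub area ($w, $h) {
    assert_num($_) for ($w, $h);
    return $w * $h;
}

print area(3, 4), "\n";     # prints 12
eval { area(3, 'wide') };
print "caught: $@" if $@;   # the constraint rejects 'wide'
```

The checks run at call time, not compile time, which is what makes this gradual: you add them only where they pay for themselves.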
What kind of performance impact does it have? Obviously it depends on the specific program, but let's say the worst case scenario, something like a recursive implementation of the factorial function.
"Auto-Parallelization - Automatic parallel map, grep, for, while loops via Rayon work-stealing"
Given an arbitrary "for" loop, how can it know that no synchronization is required? That no mutual exclusion is required? No concurrent access of some kind? Offloading work to another process/thread is expensive, too.
If the inner body of the loop is a pure function, then that's easy (except for the performance part, which may require heuristics or something). But if the body is not pure...? I can't see how this can work reliably with arbitrary code.
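To make the purity concern concrete, here is what the two cases look like in plain Perl (this only illustrates the analysis problem; the thread doesn't say how pperl actually decides):

```perl
use strict;
use warnings;

my @xs = (1 .. 5);

# Trivially parallelizable: each iteration reads only its own
# element and writes only its own result slot.
my @squares = map { $_ * $_ } @xs;

# Not safe without synchronization: every iteration writes the same
# accumulator and the same hash, so concurrent execution would race,
# and side-effecting bodies can make the result order-dependent.
my $sum  = 0;
my %seen = ();
for my $x (@xs) {
    $sum += $x;
    $seen{ $x % 2 }++;
}

print "@squares\n";   # 1 4 9 16 25
print "$sum\n";       # 15
```

An auto-parallelizer has to prove the loop body is in the first category (or insert synchronization) before offloading it, and then still decide whether the work is big enough to amortize the dispatch cost.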
I had to build a Perl implementation of the Chaskey MAC algorithm. ChatGPT spat out a working Perl prototype based on a C file for Arduino. It was quite slow, with not very much to optimize, so I had it rewrite it with XS. An hour later I had a working XS implementation that compiles and tests cleanly.
So the AutoFFI thing is super interesting. The .plc also.
Man, I would have just learned Ruby.....
But using the keyboard arrow keys works for me.
Project page is: https://perl.petamem.com/
In case HN shows its user hostility again by cutting off the URI fragment, the intended deep-link was presentation slide #/4/1/1
They could also use the query part of the URL rather than the anchor.
Lastly, statically hosted doesn't mean no URL rewriting; they could again catch links to specific parts easily.
The poor UX of these tools is just a lack of will, not a technical limitation.
Then again, Hacker News should probably not blanket-delete the hash in URLs either.
The parser is also quite good by now. In the beginning, Perl 5 had to bootstrap pperl's parsing, but not anymore.
The AI tar pits are real. "We'll use the regex crate - it will be fine." And me being like: "No."
OTOH, the AI kept telling me that I'm absolutely right, so it must be good. ;-)
https://metacpan.org/pod/Coro
https://metacpan.org/pod/MCE
https://metacpan.org/pod/OpenMP
https://metacpan.org/dist/UV
I can't help but giggle at the fact that an AI-written project doesn't seem to get its home page right.
[1] https://docs.rs/rayon/latest/rayon/
[2] https://cranelift.dev
https://perl.petamem.com/docs/eng/petaperl/differences.html
The autoffi thing is nothing new; I did that with cperl a decade ago. Added native types also, which he doesn't have yet.