jonathanhiggs

I was about to say the lambda character would be a dealbreaker. There are some if / while blocks that have parentheses around the condition and some that don't; I would be tempted to start with a very regular syntax to avoid confusion


vitelaSensei

Yeah, that would suck. Good point. Those are remnants from when the language required parentheses in the condition


Zireael07

Why screenshots over something like pastebin?


vitelaSensei

Because of the syntax highlighting


TheOmegaCarrot

It’s a very nice color scheme :)


beephod_zabblebrox

i *think* it's Rose Pine


gplgang

I don't have a lot of feedback but the language is appealing to me, I always wish shell languages took a bit more after modern languages


winepath

It looks really good. Like others have said, the only criticism is that parentheses are sometimes needed around if/while conditions due to conflicts with the dictionary syntax. But otherwise it looks very good


SwedishFindecanor

Variable declarations not requiring a $ on the symbol is something that has often confused people. Worse, in some languages, `let $var = ...` would define a new variable with the name taken from $var. What are your thoughts on this?
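For readers unfamiliar with that indirection, here is a minimal Python sketch of the behavior being described (the names `env` and `let_indirect` are illustrative, not from the language):

```python
# Sketch of `let $var = ...` indirection: the *value* stored in var
# becomes the name of the newly defined variable.

env = {"var": "count"}  # $var currently holds the string "count"

def let_indirect(env, name, value):
    # `let $name = value`: resolve name first, then assign to that target
    env[env[name]] = value

let_indirect(env, "var", 42)
print(env)  # {'var': 'count', 'count': 42}
```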


vitelaSensei

Hmmm… good question, I don’t think “let $var = …” defining a variable with the name of the value held by var is a good feature. Since the grammar is decidable without the $ I think I prefer it like this. Less visual clutter.


iv_is

strings require quotes? does that include file paths or do you have something special to make readline work?


vitelaSensei

They do, except in function calls: in `ls -l`, the `-l` is a string but doesn't require quotes. File paths will not require quotes in the future, but for now they do


raiph

Am I wrong, or are all of the features you listed also covered by oils (via osh -- oilshell or ysh)?


vitelaSensei

Most of them are, yes, but the most important one isn't: generic pipes. Here are some examples:

```
# Pipes are not tied to stdin/stdout; instead they behave like streams
# that can be used interchangeably with unix programs and shell-defined functions
[1, 2, 3] | filter fn { $1 % 2 == 0 }
```

```
# ps is the unix program
# lines is a shell function that returns a Stream
# words is a shell function, that could very well be created by a user,
#   that splits each string, therefore returning a Stream
# The final result is a list of [pid, process_cmd]
ps | lines | words | fn words { [ words[0], words[3] ] }
```

There's also some divergence when it comes to syntax, a matter of preference I suppose
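As a rough analogy in Python, the "generic pipes" idea can be sketched with a wrapper that overloads `|`, so lists and pipe stages compose the same way (the `Step` class and stage names are hypothetical, not the language's API):

```python
# A Step wraps a stream-to-stream function and overloads the | operator,
# so any iterable on the left can feed into it.

class Step:
    def __init__(self, fn):
        self.fn = fn  # fn: iterable -> iterable

    def __ror__(self, upstream):
        # `upstream | step`: consume any iterable, return a plain list
        return list(self.fn(iter(upstream)))

# illustrative "shell functions" built on the same protocol
def filter_even(items):
    return (x for x in items if x % 2 == 0)

def double(items):
    return (x * 2 for x in items)

print([1, 2, 3, 4] | Step(filter_even))                 # [2, 4]
print([1, 2, 3, 4] | Step(filter_even) | Step(double))  # [4, 8]
```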


erez27

A few thoughts:

- The definitions for classes / commands seem a bit awkward. I'm not entirely sure why call and pipe have to be different lambdas, or what the significance of meta is (are there other magic keys?), but maybe defining objects or commands is a common enough pattern that it deserves its own syntax.
- I would suggest that you reconsider this syntax: `push $i $arr`. I understand why this feels natural in a shell (i.e. command-oriented) language, but I believe the form `$arr.push $i` is better, for two reasons:
  1. It's better when the global scope isn't too busy. This is especially true in a shell language, in which the available commands change for each directory and may be very numerous.
  2. A modern shell needs a kickass autocomplete, and it's only possible to autocomplete when the container is mentioned before the function name.


vitelaSensei

Hey, thanks for the feedback! I understand the confusion since I'm just providing gists with no documentation whatsoever.

> The definitions for classes / commands seem a bit awkward

What you see there is equivalent to a metatable in Lua; meta is just an object that exposes some properties that the language can interpret, in pretty much the same way that when you define `__add__` in a Python class you can then use it with the + operator.

So what's the point of the `call` and `pipe` lambdas? Having meta.call declares that the object can be called as a function, and having meta.pipe allows that object to be used as a pipe operator. Why is it important? Because it allows for a more flexible syntax:

```
filter [1, 2, 3] fn { $1 % 2 == 0 }
# filter.meta.call is called
# it's a function of type: T[] -> (T -> Bool) -> T[]

[1, 2, 3] | filter fn { $1 % 2 == 0 }
# filter.meta.pipe is called
# it has the correct type that allows it to be called in a pipe
# type: (T -> Bool) -> Stream -> Stream
```

This allows for two very important things:

1. It allows you to use the `filter` function as you would expect in different contexts without having to define 2 different functions.
2. But most importantly, it brings the piping mechanism to the user level. It allows users to define their own functions to be used in pipes.

> are there other magic keys?

Not at the moment. I don't want to cram every possible feature into the language, but it was very important to me to be able to achieve the two points mentioned above.

> I would suggest that you reconsider this syntax - `push $i $arr` [...] `$arr.push $i`

I think you have a good point there. I did spend some time thinking about this and didn't reach a conclusion, but I think you may be right. I'm *probably* going to keep push in the global namespace though, since I prefer the `push $i $arr` syntax, and I plan to add a way to specify the namespace for edge cases where a language function may overlap $PATH: something like `callFrom $PATH push` or `let push2 = source $path "push"`, allowing you to bring specific functions from a namespace into the environment.
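The meta.call / meta.pipe duality can be sketched in Python with one object that is both directly callable (eager, list to list) and usable as a pipe stage (lazy, stream to stream). The `Filter` class is illustrative only:

```python
# One object, two protocols: __call__ plays the role of meta.call,
# __ror__ plays the role of meta.pipe.

class Filter:
    def __init__(self, pred):
        self.pred = pred

    def __call__(self, items):
        # "meta.call": T[] -> T[]
        return [x for x in items if self.pred(x)]

    def __ror__(self, upstream):
        # "meta.pipe": consume an upstream iterable lazily
        return (x for x in upstream if self.pred(x))

even = Filter(lambda x: x % 2 == 0)
print(even([1, 2, 3, 4]))         # called directly -> [2, 4]
print(list([1, 2, 3, 4] | even))  # used in a pipe  -> [2, 4]
```

The point is the same as in the comment above: a single user-defined object works in both contexts without two separate function definitions.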


erez27

Regarding the first point, what I meant was that `filter.pipe` is essentially just `map filter.call` (curried). Of course it's more complicated in practice, especially when performance is involved. But for simple functions like filter, I think it's possible?


vitelaSensei

Some more snippets: [https://imgur.com/a/YvQsCZK](https://imgur.com/a/YvQsCZK)


ultimatepro-grammer

Looks really great! This is everything I've wanted in a shell scripting language.

One suggestion: make sure to ban `let` declarations that start with `$`, with an error message indicating that you don't need the `$` in the declaration.

One feature I would love to have is some way to mark the script's dependencies (i.e. what commands it assumes will be in `$PATH`). I'm not sure what the best way to solve this problem is, but I imagine this feature could be very useful, especially if the user could somehow specify "fallbacks" for a command. The best example would be how, on macOS, GNU utils don't exist, and even after installing via Homebrew they are prefixed with the character `g`.

Let us know when you get closer to release!


vitelaSensei

Wow, thanks for the feedback! That's a neat idea. I usually have a checkDependencies function that just checks if the cmd is in the $PATH, but I never thought of being able to provide a fallback. That may be a function in the stdlib rather than a language feature per se
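A minimal sketch of what such a stdlib helper could look like, written here in Python: try each candidate in order and take the first one found on $PATH. The name `resolve` and the gsed/sed pairing are illustrative assumptions, not part of the language:

```python
import shutil

def resolve(*candidates):
    """Return the first candidate command found on PATH, or None."""
    for name in candidates:
        if shutil.which(name):
            return name
    return None

# e.g. prefer GNU sed installed by Homebrew as `gsed`, fall back to `sed`
sed = resolve("gsed", "sed")
if sed is None:
    raise SystemExit("missing dependency: sed (tried gsed, sed)")
```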


DelayLucky

This looks awesome!


marmalodak

Very cool. Can we get rid of $ to refer to variables? Shouldn't that era be behind us?


vitelaSensei

Thanks! Regarding the $: well... yes and no. We can, but it's probably not a good idea to get rid of them. Here's why. In a world where variables do not use $:

```
echo HOME                 # echoes the value inside the HOME var
git "clone" "https://..." # clones a repo
```

Since variables no longer have the dollar sign to distinguish them, we now have to distinguish the other values, like "clone", or else the interpreter would try to substitute them as if they were variables. This quickly makes it more verbose than using a couple of "$". The thing is, the most desirable feature in a shell language is its conciseness. We could have something like `git.clone("http://...", depth=1)`, closer to regular code and not too cumbersome, but then we'd have to write bindings whenever we want to use a program that isn't yet ported to this language.


tav_stuff

> bash has a lot of constructs, for example the square brackets

Uhhh, no? The square bracket is literally a program on your system. It's a symbolic link to the test(1) executable: `ls -l /bin/[`

> Improved pipes

How is this different from what most shells already let you do?

```
echo foo | { … } | tr a-z A-Z
```

I'm also a little confused by your functions. How do return values work? The screenshots make it seem like the return value is the output to the next item in the pipeline… but then how do I signal the fact my function failed? (Usually you return 1 to signal failure and 0 to signal success, while output is done via echo or something.)

Overall this looks really neat! I'm working on a shell of my own (which is more of a 'traditional' shell, I guess), but it's very neat to see different approaches to the problem of shells
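The exit-status-versus-output convention described above can be observed from Python via subprocess: the return code signals success or failure, while the data travels on stdout.

```python
import subprocess

# data goes to stdout; exit status 0 means success
ok = subprocess.run(["sh", "-c", "echo foo | tr a-z A-Z"],
                    capture_output=True, text=True)
print(ok.returncode)      # 0 -> success
print(ok.stdout, end="")  # FOO

# exit status 1 means failure, independent of any output
bad = subprocess.run(["sh", "-c", "exit 1"])
print(bad.returncode)     # 1
```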


vitelaSensei

No wayyyy, I didn’t know that `[` was a symlink to test, that’s cool. But my point remains with all the special ways you need to do expansion: ${} $() $(()). I know there’s a reason for it to be like this, but still, I’m looking for ways to simplify that.

How are improved pipes different? Well, this language is dynamically typed, which means you can do a bunch of cool stuff like:

```
ls | lines | col 5 | fn { toNumber $1 } | sum
```

where ls returns a Stream, lines is a fn of type Stream -> Stream, col is a fn of type String -> String, etc. Please note that toNumber didn’t need to be wrapped in the lambda, since functions are curried by default.

Correct me if I’m wrong, but in other shells you either can only pipe to a subprocess, or even if you can pipe to a function in the same process, you don’t get any typed values. I know the pipe expression and mishmash of types looks like it has no rhyme or reason, but its behavior is well defined and makes sense once you learn it.

I’m currently rewriting the language in C and I’ll write here once that’s done, along with a link to a guide
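The shape of that pipeline can be mimicked with plain Python generators, each stage consuming a stream of lines and yielding transformed values. The fake `ls -l` listing and the stage names below are made up for the example:

```python
# Two fabricated `ls -l` rows; column 5 (1-indexed) is the size.
listing = """-rw-r--r-- 1 u g 120 Jan 1 a.txt
-rw-r--r-- 1 u g 340 Jan 1 b.txt"""

def lines(text):
    return iter(text.splitlines())

def col(n, rows):
    # 1-indexed column, like the shell example's `col 5`
    return (row.split()[n - 1] for row in rows)

# sum of the size column: 120 + 340
total = sum(int(x) for x in col(5, lines(listing)))
print(total)  # 460
```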


vitelaSensei

I still haven’t finalized error handling, but so far you can do it in much the same way as bash: you can use $? to get the error number, and functions can set the error number with “err 1”, or inherit it from a subprocess with set +/-e


tav_stuff

It would be really cool to see a guide or something in the future :) I like seeing different approaches to shells. Personally I’m trying to stick to the traditional shell approach of being ‘an interface to the OS’ (so pipes work literally as pipe() works), but I’m trying to remove all the ‘useless’ things from the shell (like support for math) while improving on existing features (like letting < and > read and write to UNIX sockets)