Ada++ Wants To Make The Ada Programming Language More Accessible
-
Originally posted by mb_q View Post
I don't know if this is the true origin, but my personal theory is that it came from LLVM IR, which has an i<n> symbol for n-bit integers, and Rust basically compiles to it. There is no u<n> there, though, since the operations themselves are signed or unsigned. Also, floats are named as in C (float, double, etc.), not f<N>.
Maybe there isn't really a single origin at all, due to convergent evolution of ideas (it's really not that hard to think of u<n> for unsigned fixed-width types).
-
Originally posted by zxy_thf View Post
C99 has int32_t, uint32_t, int64_t, etc.
I am still waiting for the 2020 standards of C and C++ to become available and for all new compilers to use them as the default. But it looks like it will take about 10 more years to see that.
-
Originally posted by polarathene View Post
What benefits does Ada have over Rust? Asking as a dev with experience writing Rust code but not that familiar with Ada which has apparently been around for decades. I assume the developer experience isn't as good, but does Ada offer any advantages over Rust?
type DayNames is array(DayOfTheWeek) of String
Rust currently doesn't have anything like that unless you take the macro route.
Ada is a strictly imperative language where Rust is half-way between imperative and functional. Error handling in Ada is done using exceptions unlike Rust's pseudomonads, with all the pros and cons of the respective approaches. Ada doesn't have anything like Rust's lifetime system, instead it has application-specific memory allocation pools. Memory leaks are possible. Rust's philosophy is to require no runtime system (although that's no longer true with async/await). Ada relies on a runtime system which, in some cases, is a self-contained minimal OS.
Neither Ada nor Rust is explicitly a full-fledged object-oriented language with implementation inheritance and all, but in both cases the type system can be abused to re-create it. Interestingly both end up doing it the same way, that is whether a call is dispatched statically or dynamically is not a property of the method (like in C++, Java etc.) but of the object reference.
-
Originally posted by pkunk View Post
It is actually dangerous, hard to understand and weird:
The analyzer detected an expression leading to undefined behavior. A variable is used several times between two sequence points while its value is changing. We cannot predict the result of such an...
https://github.com/apple/swift-evolution/blob/master/proposals/0004-remove-pre-post-inc-decrement.md
No, these operators are not dangerous.
What is dangerous is programmers who are not aware that in many programming languages, especially in any language where performance matters, there are many cases where the order of evaluation of certain expressions is undefined, so any program that depends on the order of evaluation in those cases is invalid.
There are many languages where the order of evaluation of the arguments passed to a function is undefined, and this is a good feature.
By that logic, function calls are also a dangerous feature, because some programmers will not bother to learn that in such languages they must write only function calls that do not depend on the order of evaluation of the arguments (which is recommended as good style even in languages that do define an order, e.g. left-to-right).
I believe that knowing when the order of evaluation is defined and when it is undefined, and writing code accordingly, are essential requirements for any decent programmer, and removing a few supposedly hard-to-learn operators from a language will not save anyone from the need to understand these facts.
-
Most of Ada++'s changes are just abbreviations and syntactic sugar that I wished for myself when I started programming in Ada with my background as a C developer, but after several days of coding I no longer felt they were needed (that had more to do with my own "resistance to change").
I fail to see the benefit of replacing Integer and Long_Integer with Int_32 and Int_64.
First, if I want an Integer, in my IDE I will type "Int" and use auto-completion.
Now with Int_32 and Int_64, the auto-completion will systematically stop after the "_" and I will have to choose between 32 and 64 every time.
Most of the time, a simple 32 bit integer is enough when using those types, so this shortcut brings a new burden...
But moreover, this looks like someone wanting to program in Ada like you would do in C.
In Ada, you have advanced typing features, especially for defining precise types with ranges and binary representation.
So in Ada, when I use an Integer, this means I just want any integer (I don't care about bounds or binary representation).
If my data doesn't fit in an Integer, I may use a Long_Integer. But I would never want to rely on the fact that an Integer is a 32-bit integer and a Long_Integer is a 64-bit integer. If a type's size is important, then I will define a specific type for it ("type Foo is range 0 .. 100 with Size => 32;"). This is the Ada philosophy, and this is how it should be done. Therefore, replacing "Integer" with "Int_32" does not help newcomers develop Ada programs; they will just write C programs in Ada. In my view, this is an anti-feature (Ada--).
-
Originally posted by pkunk View Post
It is actually dangerous, hard to understand and weird:
The analyzer detected an expression leading to undefined behavior. A variable is used several times between two sequence points while its value is changing. We cannot predict the result of such an...
https://github.com/apple/swift-evolution/blob/master/proposals/0004-remove-pre-post-inc-decrement.md
The hyperlinks you provided were insightful. I never realized these operators were so bad.
-
Originally posted by pkunk View Post
It is actually dangerous, hard to understand and weird:
The analyzer detected an expression leading to undefined behavior. A variable is used several times between two sequence points while its value is changing. We cannot predict the result of such an...
https://github.com/apple/swift-evolution/blob/master/proposals/0004-remove-pre-post-inc-decrement.md
The drawbacks in C and C++ are mostly related to memory management, in specific cases where handling it by hand is tricky.
Much worse, for me, is what Python did by encouraging careless use of dynamic typing; the result is that lots of projects became a nightmare to maintain. Not that the language is bad; the problem is that many started using it unaware of the implications for huge code bases, while it is fine for small ones. The same goes for JavaScript.
Every developer should first take a course on selecting the right tool for the job, as is done in every other engineering field.