Comments (5)
As mentioned, the difference between macros and comptime is that macros are specified to run as a separate program.
A comptime expression must be prevented from using any global non-constant/non-comptime state.
If a comptime expression, e.g., increments a global variable, then it's not valid as a constant. Compile-time operations must not have side effects, because those side effects may matter to keeping the result valid.
If a comptime expression reads a global variable, then that variable must have a constant value.
Generally, a comptime operation can only invoke other comptime operations, and currently there are none.
We'd have to annotate the platform APIs with the operations that are comptime compatible, but in an interface-based language, that's not a type-based thing. You can't do `for (var i in something)` unless all subtypes of the type of `something` have a comptime-declared `get iterator` which returns values that are mutable (iterators have mutable state) yet still guaranteed to be comptime compatible, meaning they must not touch global state.
Dart is not designed to make global side effects visible in an API.
Imagine if some class's `toString` updated a global variable; then you couldn't be allowed to call `Object.toString` in general. Well, it happens that `List.toString` does exactly that: it uses a global stack for detecting cycles, also shared with other iterables. It might even be the same one used by `jsonEncode`.
That code could probably be rewritten to not (re)use a global variable, but ... The alternative would be storing the stack in a zone, which is still mutable state external to the function, only now it's updating `Zone._current` instead.
Which brings us to `async`, which is just one big heap of global state. That too could probably be rewritten to be more self-contained, so async computations can be made to run to completion at comptime and not leave lasting changes, but it's not something the API was designed for.
from language.
We did experiment with what we called "enhanced const", which would expose all the synchronous parts of the language to the const evaluator (basically giving you what you are asking for here, but without needing the explicit comptime keyword/annotation, and with no async support).
It ends up being quite complicated to support well because of the invalidation semantics that are implied. Any function invoked at compile time invalidates anything that depends on it whenever the body of that function changes, and the same goes for any functions used by that function, etc.
Adding the comptime annotation would assist with that in some ways, making it explicit what is allowed to be used at compile time, and thus which things have the worse invalidation/compilation-time behavior. We also considered similar approaches, but determined it would inevitably lead to requests for any sufficiently used package to add `@comptime` to its functions, and you end up in the same situation.
from language.
> It ends up being quite complicated to support well because of the invalidation semantics that are implied. Any function invoked at compile time invalidates anything that depends on it whenever the body of that function changes, and the same goes for any functions used by that function, etc.
To be fair, the same happens for macros too. A macro executes code from its dependencies, so it depends not just on their API but on their full code. Because of this, the kernel is recompiled on any code change, and the API of any library that applies a macro depends on the full code of the macro and any macro dependencies. So the analyzer rebuilds element models for libraries with macro applications even if the changes are only in function bodies.
Speaking of macros, and the original request of this issue: macros are already async, and while I think we don't want you to use IO during macros, technically I think it is possible right now :-)
from language.
The difference with macros is that the boundary is better defined and we expect them to be written less often. If any constant expression could call functions, that would likely get used a ton, and so we would hit these invalidation semantics much more often.
from language.
> Speaking of macros, and the original request of this issue. Macros are already async, and while I think we don't want you to use IO during macros, technically I think it is possible right now :-)
It is currently possible, I think, yes, but it is specified that it is not allowed :). We also don't have the replacement (Resource) API implemented, and it isn't the end of the world if people use `dart:io` right now, if they are just reading files.
from language.