Comments (6)
You may add an enable-fp16 build test to CI.
You may also limit test coverage or the error-reporting mechanisms for enable-fp16. (You can let the GitHub Action gracefully ignore failed TCs, so that we can see the number of failed cases while the CI still says "OK".)
Anyone needing this may add it. (@heka1024 ?)
from nntrainer.
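The suggestion above, running the fp16 build in CI but reporting failed test cases without blocking the PR, can be sketched with GitHub Actions' `continue-on-error` key. This is a minimal sketch under assumptions, not the actual nntrainer workflow: the job and step names are hypothetical, and whether `-Denable-fp16` takes a `=true` value is assumed from the thread.

```yaml
# Hypothetical CI job sketch (names and commands are assumptions).
jobs:
  build-fp16:
    runs-on: ubuntu-latest
    continue-on-error: true   # job failures are shown but do not mark the PR check red
    steps:
      - uses: actions/checkout@v4
      - name: Configure with fp16 enabled
        run: meson setup build -Denable-fp16=true
      - name: Build
        run: ninja -C build
      - name: Run tests (failures visible in the log, non-blocking)
        run: meson test -C build
```

Note that `continue-on-error: true` at the job level keeps the overall workflow green while still recording the step logs, which matches the "CI says OK but we can see the number of failed cases" behavior described above.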
@skykongkong8 @myungjoo I'll add this to CI (as a non-required test). Thanks for the comment!
Good point.
As far as I know, the reason why `-Denable-fp16` is not included in the current CI is that it is a relatively newer option than the others.
Even now, I personally think it is not mature enough to be checked on every PR, because we have multiple plans to refactor the structure of the half-precision formats and related code. I need more opinions about this. @jijoongmoon
cibot: Thank you for posting issue #2560. The person in charge will reply soon.
@skykongkong8 What do you think?
Great! Thanks for pointing this issue out :)
Related Issues (20)
- Channel Last Tensor save/read fails occasionally HOT 1
- Random Idea for Future Features: G-LoRA on NNTrainer. HOT 1
- Build fails with `-Dplatform=android` HOT 1
- Knowledge Embedding Interface Specification for RAG in NNTrainer HOT 2
- Add Depthwise 2D Convolution Layer HOT 1
- Some confusion about random dataset HOT 8
- DRL algorithm with api HOT 8
- Issues and Questions about Execution of LLaMA using NNTrainer HOT 8
- [ Tensor ] Accelerate fp16 matrix transpose with SIMD HOT 3
- [ HGEMM ] Half-Precision GEMM Roadmap HOT 4
- [Q&A] How to solve build failure for flatbuffers's Table. HOT 2
- Bug in `max_abs()` function in FP16 Tensor HOT 3
- Issue in running the resnet18 example. HOT 4
- Running examples on PC issue HOT 3
- Support hyper parameter for activation layer HOT 1
- Support loading weights from pytorch model HOT 1
- Add the calc_derivative and unittest of quick gelu HOT 1
- Support encoder on Ubuntu & Tizen HOT 1
- Support Convolution & BatchNorm Fusing for Optimized Inference Mode HOT 1