
Comments (13)

acferrad commented on August 19, 2024

Yes, great!


milancurcic commented on August 19, 2024

I think there are two separate things here.

When I build and run example_mnist, it appears to do all the training successfully, but does not appear to save or write its results anywhere.

Indeed, we don't save the network to file after training it in the MNIST example. Sure, we could add it; it would just be call net % save('mnist_network.txt') right before the end of the program. Saving the network is shown separately here, so I didn't include it in example_mnist. Is saving the network what you'd like to see in the MNIST example?
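
For reference, a minimal sketch of what that addition would look like (assuming the program unit is named example_mnist; the output file name is just an example):

  ! ... training loop of example_mnist ...

  ! write the trained weights and biases to a text file
  call net % save('mnist_network.txt')

end program example_mnist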

When I build and run test_mnist, it just reads the test results from an old DAT file which comes with the GIT download.

Yes, this just tests for correct reading of features and labels data file, nothing more.

It would be useful for example_mnist to recreate this test DAT file so that the user can compare it to the downloaded version to make sure all is working correctly.

That's what example_mnist is meant to do: we're not re-creating the original data (features and labels); we're training a network using that data.

Or are you saying that we should do a roundtrip read/write/read of the original data to confirm it's being read correctly? πŸ˜•


acferrad commented on August 19, 2024

This is what I was thinking the mnist example was doing before I looked into the code:

example_mnist.exe would train the neural net using the data in mnist_training*.dat, and pass that trained information on to test_mnist.exe so that it could use that information to predict digits from new images, to see how good the trained neural net was. This is what I would be doing if I were to use this code in another application.

However, test_mnist.exe is not doing this: it is merely reading training, test, and validation data from a run done in the past, and printing it out.

I did make some changes to the code to do this: at the end of example_mnist.f90, I added:

! save network 1 to file
  call net % save('my_mnist.txt')

and then in test_mnist.f90 I added:

  call net % load('my_mnist.txt')
  call net % set_activation('sigmoid')
  do ii = 1, size(te_images,2)
    print *, ii, 'guess: ', maxloc(net % output(te_images(:,ii))) - 1
    call print_image(te_images, te_labels, ii)
  enddo

I couldn't find exact examples of how to do this in the code, so I hope what I have done is correct.


milancurcic commented on August 19, 2024

However, test_mnist.exe is not doing this: it is merely reading training, test, and validation data from a run done in the past, and printing it out.

No, test_mnist is loading the source data (images and labels), not any network data. You can safely ignore this test and example -- it only tests that the subroutine loads MNIST images and labels correctly. It has nothing to do with previously trained networks.

To save and load network weights and biases, use net % save() and net % load(), like in example_save_and_load.
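
As a rough, self-contained sketch of that round trip (the mod_network module and the network_type array constructor follow the library's existing examples; the layer sizes and file name here are arbitrary):

program save_and_load_sketch
  use mod_network, only: network_type
  implicit none
  type(network_type) :: net1, net2

  net1 = network_type([3, 5, 2])       ! small network: 3 inputs, one hidden layer of 5, 2 outputs
  call net1 % save('my_network.txt')   ! write layer sizes, weights, and biases to file

  net2 = network_type([3, 5, 2])       ! same geometry as the saved network
  call net2 % load('my_network.txt')   ! read the saved parameters back in
end program save_and_load_sketch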


milancurcic commented on August 19, 2024
  call net % load('my_mnist.txt')
  call net % set_activation('sigmoid')
  do ii = 1, size(te_images,2)
    print *, ii, 'guess: ', maxloc(net % output(te_images(:,ii))) - 1
    call print_image(te_images, te_labels, ii)
  enddo

I think I'm beginning to understand what you're trying to do here -- for each network prediction, print out the feature to the screen?

If yes, this seems to me like it should work. However I still don't get why you'd wanna do this -- you can just print te_labels(ii) to the screen next to your prediction?


acferrad commented on August 19, 2024

I think I'm beginning to understand what you're trying to do here -- for each network prediction, print out the feature to the screen?

Almost. I treat training the network and using those training results to predict the result of a test as two separate events: the first is done once, the second multiple times for different inputs. Yes, I need to print the result to the screen to see what it is. I chose to print both the prediction and the test image it is trying to match, so I can see how good my trained net is.

If yes, this seems to me like it should work. However I still don't get why you'd wanna do this -- you can just print te_labels(ii) to the screen next to your prediction?

Because I want to test it on other data, not the test data generated back in 2018. I have new data, and I want to check whether my trained neural network does a good job on it, not on some old test data. After all, this is the purpose of a neural network: to train on example data, and then use that trained model to predict outputs from (new) inputs.


milancurcic commented on August 19, 2024

From your response, I understand that you want to visually inspect the input feature and the network output side-by-side. That's certainly valid, but neither test_mnist nor example_mnist was meant to do that. test_mnist tests for correct reading of MNIST features and labels. example_mnist shows how to train and evaluate a network against the MNIST dataset.

If you'd like to write a short example of what you're doing and submit a PR, I'd be happy to review and merge.


acferrad commented on August 19, 2024

I'm not sure that what you say above is what I mean. I think what is needed is a "predict_mnist".

Similar to the following linear regression example:

  1. We have a set of (x,y) data points
  2. Fit a straight line to these data points
  3. Use the straight line to predict y at any value of x

In this analogy, points 1 and 2 above are what example_mnist does: it fits ("trains") the neural net to a set of data points (the training data). Point 3 above would be what predict_mnist should do: see how well the straight line (the trained model) behaves with new user input (a new handwritten digit).

The visualization is not necessary; I just used it because it was already there in the print_image subroutine. What is important is which digit the trained net thinks my new handwritten number is.


milancurcic commented on August 19, 2024

OK, got it! Then your earlier snippet should work for that, for example (pretend that te_images and te_labels are your new data):

call net % load('my_mnist.txt')
call net % set_activation('sigmoid')
do ii = 1, size(te_labels)
  print *, 'Input: ', te_labels(ii), 'Prediction: ', maxloc(net % output(te_images(:,ii))) - 1
end do

So, to come full circle, you're suggesting we have an example showing how to do this (indeed -- example_mnist only prints the accuracy, but doesn't print individual predictions).
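
For completeness, a standalone prediction example along these lines might look roughly like the sketch below. The array declarations and the way the new data get filled in are placeholders; only the net % load, net % set_activation, and net % output calls follow the usage already shown in this thread:

program predict_mnist_sketch
  use mod_network, only: network_type
  implicit none
  type(network_type) :: net
  real, allocatable :: te_images(:,:)   ! one flattened image per column
  real, allocatable :: te_labels(:)     ! true digit of each image, for comparison only
  integer :: ii

  ! ... fill te_images and te_labels with your own data here ...

  call net % load('my_mnist.txt')        ! weights and biases saved by the training run
  call net % set_activation('sigmoid')   ! must match the activation used during training

  do ii = 1, size(te_labels)
    print *, 'Input: ', te_labels(ii), &
      'Prediction: ', maxloc(net % output(te_images(:,ii))) - 1
  end do

end program predict_mnist_sketch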


tkdhss111 commented on August 19, 2024

Hi, milancurcic,

I compiled your program in cafrun (coarray) mode, but it produced a few error messages at runtime: "Fortran runtime error on image 2: Unsupported data type in collective: 0". What is the problem here? Any idea? It runs fine in single-image mode. The OpenCoarrays library was installed following these instructions (https://www.scivision.dev/fortran-2018-coarray-quick-start/).

Hisashi

jma@z820:~/1_Projects/neural-fortran/build$ FC=caf cmake ..
-- The Fortran compiler identification is GNU 9.1.0
cc1: warning: command line option β€˜-fcoarray=lib’ is valid for Fortran but not for C
gfortran: warning: /usr/lib/x86_64-linux-gnu/libcaf_mpi.a: linker input file unused because linking not done
-- Check for working Fortran compiler: /usr/bin/caf
-- Check for working Fortran compiler: /usr/bin/caf -- works
-- Detecting Fortran compiler ABI info
-- Detecting Fortran compiler ABI info - failed
-- Checking whether /usr/bin/caf supports Fortran 90
-- Checking whether /usr/bin/caf supports Fortran 90 -- yes
-- Configuring build for 32-bit integers
-- Configuring build for 32-bit reals
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jma/1_Projects/neural-fortran/build

jma@z820:~/1_Projects/neural-fortran/build$ make
Scanning dependencies of target neural
[ 4%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_kinds.f90.o
[ 8%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_activation.f90.o
[ 12%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_random.f90.o
[ 16%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_io.f90.o
[ 20%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_layer.f90.o
[ 24%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_parallel.f90.o
[ 28%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_mnist.f90.o
[ 32%] Building Fortran object CMakeFiles/neural.dir/src/lib/mod_network.f90.o
[ 36%] Linking Fortran static library lib/libneural.a
[ 36%] Built target neural
Scanning dependencies of target test_network_sync
[ 40%] Building Fortran object CMakeFiles/test_network_sync.dir/src/tests/test_network_sync.f90.o
[ 44%] Linking Fortran executable bin/test_network_sync
[ 44%] Built target test_network_sync
Scanning dependencies of target example_save_and_load
[ 48%] Building Fortran object CMakeFiles/example_save_and_load.dir/src/tests/example_save_and_load.f90.o
[ 52%] Linking Fortran executable bin/example_save_and_load
[ 52%] Built target example_save_and_load
Scanning dependencies of target test_network_save
[ 56%] Building Fortran object CMakeFiles/test_network_save.dir/src/tests/test_network_save.f90.o
[ 60%] Linking Fortran executable bin/test_network_save
[ 60%] Built target test_network_save
Scanning dependencies of target test_mnist
[ 64%] Building Fortran object CMakeFiles/test_mnist.dir/src/tests/test_mnist.f90.o
[ 68%] Linking Fortran executable bin/test_mnist
[ 68%] Built target test_mnist
Scanning dependencies of target example_mnist
[ 72%] Building Fortran object CMakeFiles/example_mnist.dir/src/tests/example_mnist.f90.o
[ 76%] Linking Fortran executable bin/example_mnist
[ 76%] Built target example_mnist
Scanning dependencies of target example_sine
[ 80%] Building Fortran object CMakeFiles/example_sine.dir/src/tests/example_sine.f90.o
[ 84%] Linking Fortran executable bin/example_sine
[ 84%] Built target example_sine
Scanning dependencies of target test_set_activation_function
[ 88%] Building Fortran object CMakeFiles/test_set_activation_function.dir/src/tests/test_set_activation_function.f90.o
[ 92%] Linking Fortran executable bin/test_set_activation_function
[ 92%] Built target test_set_activation_function
Scanning dependencies of target example_simple
[ 96%] Building Fortran object CMakeFiles/example_simple.dir/src/tests/example_simple.f90.o
[100%] Linking Fortran executable bin/example_simple
[100%] Built target example_simple

jma@z820:~/1_Projects/neural-fortran/build$ cafrun -n 4 bin/example_mnist
Fortran runtime error on image 2: Unsupported data type in collective: 0

Fortran runtime error on image 4: Unsupported data type in collective: 0


Primary job terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.

Fortran runtime error on image 1: Unsupported data type in collective: 0

Fortran runtime error on image 3: Unsupported data type in collective: 0


mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[52267,1],1]
Exit code: 1

Error: Command:
/usr/bin/mpiexec -n 4 bin/example_mnist
failed to run.


milancurcic commented on August 19, 2024

Hi Hisashi, I haven't encountered this run-time error before, and I can't reproduce it on my current setup (Ubuntu 18.10, gfortran-8.2.0, OpenCoarrays-2.3.1).

This could be a regression in a later version of either gfortran or OpenCoarrays. What version of OpenCoarrays are you using?


tkdhss111 commented on August 19, 2024


milancurcic commented on August 19, 2024

Closing as no longer applicable.

