Comments (12)
::: is part of the GNU parallel command line and should not be included in the arguments if the script is run by itself. GNU parallel is a very useful utility, worth learning how to use.
from latplan.
I see; it seems you also removed the entire line containing $common, which is why GNU parallel is not invoked.
Hi @guicho271828, I'm working with @rabachi on trying to understand and replicate some of the Cube-Space AE experiments using the 4.1.3 code release. Thank you for your response earlier, it helped us streamline our investigations into the code.
For simplicity's sake we're just focusing on the MNIST 3x3 puzzle and are having a bit of a hard time parsing through the steps and scripts needed to fully replicate the code. Again, for simplicity I've tried to break down which scripts and commands are needed to replicate the results for MNIST 3x3 (ignoring the large batch commands that you've provided in the shell files). Please let me know if I've understood the expected procedure correctly.
- Generate the data using
./setup-dataset.sh
<-- This is very straightforward, thank you.
- Train the Cube-Space AE for the MNIST 3x3 experiment using the best hyperparameter settings
python strips.py reproduce-planning puzzle mnist 3 3 5000 None None None False ConcreteDetNormalizedLogitAddEffectTransitionAE "planning"
- Extract the PDDL code
Here is the first place where I'm not very confident that I understand what is going on. Is it expected to execute the script domain-actionlearner4.ros inside the lisp subfolder? How do we ensure that it's acting on the files generated and saved by the script executed in Step 2?
ros dynamic-space-size=8000 install guicho271828/magicffi guicho271828/dataloader
fails with the error "Attempting to install the scripts in roswell/ subdirectory of the system... No roswell scripts found." even though I've installed Roswell by following the instructions in ./install.sh
Extracting PDDL: lisp/domain-actionlearner4.bin doesn't exist, and running lisp/domain-actionlearner4.ros throws the error that Component DATALOADER does not exist
- Generate problem instances running
python problem-instances/generate-dijkstra.py 7 30 puzzle mnist 3 3
- With the PDDL code and the generated problem instances, solve with various heuristic versions of A*
Here is the second place where I've gotten confused about what is expected. In the README for this repository there are 5 separate run_ama* scripts, and I have had difficulty understanding which is needed to reproduce the MNIST 3x3 example. I presume that it's one of the ama_*_planner.py files, but again, it's unclear how to specify the options and point to the various stored problem instances and PDDL files.
- Generate the tables / aggregate the results using some combination of table4.sh and performance.sh. Again, it's unclear how to specify that I only want to extract the performance for the MNIST 3x3 example.
Thanks again for helping guide us through this!
- lisp/domain-actionlearner4.bin doesn't exist
You didn't run make, or make failed for some reason.
ros dynamic-space-size=8000 install guicho271828/magicffi guicho271828/dataloader fails with error
It is not failing. It is just reporting that magicffi and dataloader do not have any executable scripts. Some libraries ship executable scripts, much like pip install tqdm installs a tqdm command in ~/.local/bin. magicffi and dataloader have none.
there are 5 separate run_ama* scripts and I have had difficulty understanding which is needed to reproduce the MNIST 3x3 example
you are looking for run_ama3_all.sh, which submits multiple jobs to a job scheduler. Again, you should change the submit variable for your job scheduler. If you don't have a job scheduler, make it an empty string.
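A minimal sketch of that change (assuming submit is assigned on its own line inside run_ama3_all.sh; check the script for the exact location and the scheduler command it wraps):

```shell
# In run_ama3_all.sh: with no job scheduler available, make the submit
# command an empty string so each planner job runs directly in the
# current shell instead of being handed to a scheduler.
submit=""
```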
Thanks for your help again, we really appreciate it!
I have another question with respect to the Dataloader package.
I understand that the executable lisp/domain-actionlearner4.bin is generated by running make, but make is failing with an error.
I run the following steps:
- Install dataloader:
ros dynamic-space-size=8000 install guicho271828/magicffi guicho271828/dataloader
- Make:
make -j 1 -C lisp
This is the error that I get:
Unhandled SB-C::INPUT-ERROR-IN-LOAD in thread #<SB-THREAD:THREAD "main thread" RUNNING
{11D76D0343}>:
READ error during LOAD:
Package DATALOADER does not exist.
Line: 230, Column: 27, File-Position: 9207
Stream: #<CONCATENATED-STREAM :STREAMS (#<SB-SYS:FD-STREAM for "file /h/abachiro/latplan-4.1.3/lisp/domain-actionlearner4.ros" {100B6C5033}>
This is why I thought dataloader installation was the problem but as you stated, "no roswell scripts found" is expected for this package.
Is there another reason this error might appear?
Ah... I think a dependency has broken somewhere. Sorry, I did not notice this. It is my bug.
Please add dataloader to the list
'(cl-csv iterate alexandria trivia.ppcre serializable-object dsama lparallel)
making it
'(cl-csv iterate alexandria trivia.ppcre serializable-object dsama lparallel dataloader)
Great, thank you! After your fix, I was able to make without error and extract the PDDL file with train_others.sh.
I am now running into a small error when trying to plan.
I run the following steps after the PDDL file extraction:
- Generate init and goal files:
(cd problem-instances; ./example-dijkstra.sh)
- Clone the Fast Downward repo, then cd downward and run ./build.py -j $(cat /proc/cpuinfo | grep -c processor) release
- Planning:
python -W ignore ama3-planner.py "samples-planning/puzzle_mnist_3_3_5000_None_None_None_False_ConcreteDetNormalizedLogitAddEffectTransitionAE_planning/actionlearner4-3-actions_both+ids.csv-0-1.00.pddl" "problem-instances/vanilla/latplan.puzzles.puzzle_mnist/007-000-000/" "lmcut"
Here, I get the outputs simulated the plan, parsed the plan, decoded the plan, plotted the plan, and validated the plan. So far so good.
During the JSON file dumping, on line 141, "statistics":json.loads(echo_out(["helper/fd-parser.awk", logfile])), I get the error:
json.decoder.JSONDecodeError: Expecting value: line 12 column 10 (char 172)
and the string causing the problem is: '{\n"variables":37,\n"derived_variables":0,\n"facts":74,\n"goal_facts":37,\n"mutex_groups":0,\n"operators":249,\n"axioms":0,\n"task":5229,\n"peak":67868,\n"translate":2.577,\n"length":KB],\n"cost":KB],\n"expanded":22824,\n"evaluated":22824,\n"generated":22824,\n"search":KB],\n"initialization":0,\n"total":2.577\n}\n'
It seems that the script helper/fd-parser.awk is either parsing the logfile incorrectly, or the logfile has been created incorrectly for the length, cost, and search entries above.
Would you have any insight as to why this might be occurring? Could I be giving bad arguments to ama3-planner.py?
Thank you for your help. I hope this is the last question we'll be bothering you with!
Apparently it is due to a change in how the latest FD outputs its log file. My local version has a fix for that problem. Fixing the awk file should be straightforward.
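For anyone hitting the same problem, here is a minimal sketch of the kind of extraction fd-parser.awk performs. The sample log lines below are modeled on what recent Fast Downward versions print; the exact wording varies between FD versions, which is precisely what broke the parser here, so compare the patterns against your own logfile before adapting them:

```shell
# Create a tiny stand-in for a Fast Downward log (hypothetical sample
# lines; real FD output wording varies by version).
cat > sample.log <<'EOF'
Plan length: 7 step(s).
Plan cost: 7
Expanded 22824 state(s).
EOF

# Emit JSON key/value fragments for the statistics dictionary, matching
# each line by its leading phrase and taking the numeric third field.
awk '/^Plan length:/ {print "\"length\":" $3 ","}
     /^Plan cost:/   {print "\"cost\":"   $3 ","}' sample.log
```

This prints "length":7, and "cost":7, for the sample above; the malformed "length":KB], entries in the error suggest the old patterns were matching a memory-usage line instead.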
Ok, thanks again. I fixed the awk file, and now I have a question regarding how to interpret the output of generate_all_csv.sh.
As output I am getting the search, initialization, generated, evaluated, expanded, and total csv files.
My question is: which of these files can I use to recreate the results in Table 2 of the CubeSpace paper? How do I determine the plans found, valid, and optimal for a given domain? I see that the results for each heuristic are given in the columns of the .csv files.
I don't know how many of the lines in generate_all_csv you skipped, but it is counted inside check1.sh
Hi @guicho271828,
I also wanted to run the cube-space LatPlan; however, when it comes to make -j 1 -C lisp, it produces the following error:
Makefile:5: recipe for target 'ama1-sas.bin'
Could you please let me know how I can solve it?
Thank you very much in advance,
Sincerely, Ulzhalgas Rakhman
The issues raised in this thread are resolved in the latest version 5 update.