Comments (13)
Looking into this further, I see that the vtk writer now writes in binary format.
I believe this segfault only happens for certain meshes. I successfully wrote a quadratic triangle element to binary format and loaded it in ParaView, using the same build of PUMI with a different mesh that my code built from scratch.
from core.
Confirmed that the bug depends on the mesh: it works for triangles. Here is the test code below (not a complete working example, because you will need some files I wrote):
MeshBuilder.h has a bunch of utility functions to build some simple meshes.
MeshBuilder.cc is the implementation.
#include <stdint.h>
#include <stdio.h>
#include <iostream>
#include <iomanip>
#include <cmath>
#include <cassert>
#include <unistd.h>
#include <mpi.h>
#include <apf.h>
#include <apfMesh.h>
#include <apfMesh2.h>
#include <apfMDS.h>
#include <apfShape.h>
#include <apfNumbering.h>
#include <gmi_mesh.h>
#include <gmi_null.h>
#include <PCU.h>
#include "MeshBuilder.h"

#define SECRET_BUILDER_NUMBERING "SecretBuilderNumbering"

int main(int argc, char* argv[])
{
  MPI_Init(&argc, &argv);
  PCU_Comm_Init();
  apf::Mesh2* mesh = NULL;
  MeshBuilder* mesh_builder = new MeshBuilder();
  /* build a 2x1 rectangular triangle mesh on the unit square */
  mesh_builder->build2DRectTriMesh(mesh, 2, 1, 0, 0, 1, 1);
  std::cout << "end of my code" << std::endl;
  usleep(1000000);
  apf::writeVtkFiles("tri_test", mesh); /* reported to segfault here */
  delete mesh_builder;
  PCU_Comm_Free();
  MPI_Finalize();
  return 0;
}
I think @paulrevere4 is the most qualified to comment on this. I see he has made a lot of changes to apfVtk.cc since Nov 17.
Hi @YetAnotherMinion, I will look into this issue. Until it is resolved, try using apf::writeASCIIVtkFiles, as that will avoid the changes I made recently.
@paulrevere4 Running git bisect identifies
ca80bf450d4da92b577d89b81373f0feb2b4f3f0 is the first bad commit
commit ca80bf450d4da92b577d89b81373f0feb2b4f3f0
Author: paulrevere4 <[email protected]>
Date: Tue Nov 17 15:09:19 2015 -0500
Switched writeVtkFiles and writeOneVtkFile to write with binary encoding
Also made a file to call ascii encoding
:040000 040000 0e0361b4306ea83d82d5dbad3e22a59860ffa985 02ebff0e28611d211ddc5ba341c9d5837217de91 M apf
:040000 040000 63a3799e7292b87ca19959f1d3ed30c546df07df 8b2d8075e22e47c5c0ba2ceea7234d1c0cce1f59 M test
Also, thanks for the heads-up about writeASCIIVtkFiles; I will switch to it for now.
@YetAnotherMinion I ran the first test case that you provided and had no issues; apf::writeVtkFiles generated the files for ParaView correctly.
Here is the output when I run it:
$ mpirun ./bug
mesh verified in 0.000014 seconds
end of my code
writeVtuFile into buffers: 0.000211 seconds
writeVtuFile buffers to disk: 0.001144 seconds
vtk files batman_elm written in 0.002612 seconds
Here is my version of mpirun:
$ mpirun --version
HYDRA build details:
Version: 3.1.2
Release Date: Mon Jul 21 16:00:21 CDT 2014
CC: gcc
CXX: g++
F77: gfortran
F90: gfortran
Configure options: '--disable-option-checking' '--prefix=/usr/local/mpich3/3.1.2-thread-multiple' '--enable-threads=multiple' '--cache-file=/dev/null' '--srcdir=.' 'CC=gcc' 'CFLAGS= -O2' 'LDFLAGS= ' 'LIBS=-lrt -lpthread ' 'CPPFLAGS= -I/usr/src/mpich-3.1.2/src/mpl/include -I/usr/src/mpich-3.1.2/src/mpl/include -I/usr/src/mpich-3.1.2/src/openpa/src -I/usr/src/mpich-3.1.2/src/openpa/src -D_REENTRANT -I/usr/src/mpich-3.1.2/src/mpi/romio/include'
Process Manager: pmi
Launchers available: ssh rsh fork slurm ll lsf sge manual persist
Topology libraries available: hwloc
Resource management kernels available: user slurm ll lsf sge pbs cobalt
Checkpointing libraries available:
Demux engines available: poll select
I will still look to see why the issue is occurring. Like I said, feel free to use the apf::writeASCIIVtkFiles function as that will write the vtk files you're looking for and skip the changes I have recently made for compression and binary formatting.
It's been a while since the last activity, so I'm closing this.
Reopen if you have a program that actually reproduces
the failure for us.
It has been two days, and there is a complete working program to trigger the bug in the very first comment. Paul already said he was looking into it; since he is an undergraduate, I imagine his schedule does not allow for immediate debugging.
Paul and I ran that program together yesterday. It worked fine and Valgrind pointed out no issues. So unless you provide something more, we can't reproduce the segfault and you'll have to trace it yourself.
That is interesting; it appears that we are using the same major and minor version of MPI. I believe I was getting the segfault on Travis as well as on my local machine, so I will attempt to reproduce it on a Travis worker.
For what it's worth, we were eventually able to reproduce this and fix it.
It was a loop-bound unsigned wraparound that would only be triggered
for meshes with two or fewer elements.
Why this only segfaults with a particular build configuration, including
shared libraries, is beyond me.
67b1aa6#diff-3ce282682957447754aa33baa75bc240R140
I am glad that this puzzle was solved. I was actually unable to reproduce it the very next day on my machine, which made me wonder if I was going crazy...