THREDDS-5.4-SNAPSHOT from 05-13-2022
CentOS 7, AdoptOpenJDK 11.0.15+10, Tomcat 8.5.75
I am attempting to upgrade an extensive collection of catalogs from THREDDS 4.6.20. The first time I started THREDDS 5.4, catalina.out complained that the following .jar files were missing:
- stax-api-1.0.1.jar
- chronicle-analytics-2.21ea0.jar
- chronicle-core-2.21ea25.jar
I looked in webapps/thredds##5.4-SNAPSHOT/WEB-INF/lib and, indeed, those files are missing. This didn't stop THREDDS from loading, but I noticed that none of my FMRC datasets were loading correctly. I checked fmrc.log and found several errors like the following:
[2022-05-26T16:22:36.103-0400] INFO thredds.featurecollection.InvDatasetFeatureCollection: FeatureCollection added = FeatureCollectionConfig name ='Averages' collectionName='doppio_2017_da_avg' type='FMRC'
spec='/home/om/dods-data/thredds/roms/doppio/2017_da/avg/doppio_avg_#yyyyMMdd_HHmm#.*\.nc$'
timePartition =directory
updateConfig =UpdateConfig{userDefined=true, recheckAfter='null', rescan='null', triggerOk=true, updateType=test}
tdmConfig =UpdateConfig{userDefined=false, recheckAfter='null', rescan='null', triggerOk=true, updateType=test}
ProtoConfig{choice=Latest, change='null', param='null', outerNcml='[Element: <netcdf [Namespace: http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2]/>]', cacheAll=true}
hasInnerNcml =false
fmrcConfig =FmrcConfig: regularize=false datasetTypes=[TwoD, Files, Runs, ConstantOffsets]best = (Best, 25.000000)
[2022-05-26T16:24:37.608-0400] ERROR ucar.nc2.ft.fmrc.Fmrc: makeFmrcInv
com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalStateException: ChronicleMap{name=GridDatasetInv, file=/opt/tds/servers/thredds_test/content/thredds/cache/collection/GridDatasetInv.dat, identityHashCode=667994454}: Attempt to allocate #3 extra segment tier, 2 is maximum.
Possible reasons include:
- you have forgotten to configure (or configured wrong) builder.entries() number
- same regarding other sizing Chronicle Hash configurations, most likely maxBloatFactor(), averageKeySize(), or averageValueSize()
- keys, inserted into the ChronicleHash, are distributed suspiciously bad. This might be a DOS attack
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2051) ~[guava-30.1-jre.jar:?]
at com.google.common.cache.LocalCache.get(LocalCache.java:3951) ~[guava-30.1-jre.jar:?]
at com.google.common.cache.LocalCache$LocalManualCache.get(LocalCache.java:4848) ~[guava-30.1-jre.jar:?]
at ucar.nc2.ft.fmrc.GridDatasetInv.open(GridDatasetInv.java:62) ~[cdm-core-5.5.3-SNAPSHOT.jar:5.5.3-SNAPSHOT]
at ucar.nc2.ft.fmrc.Fmrc.makeFmrcInv(Fmrc.java:287) [cdm-core-5.5.3-SNAPSHOT.jar:5.5.3-SNAPSHOT]
at ucar.nc2.ft.fmrc.Fmrc.update(Fmrc.java:222) [cdm-core-5.5.3-SNAPSHOT.jar:5.5.3-SNAPSHOT]
at thredds.featurecollection.InvDatasetFcFmrc.updateCollection(InvDatasetFcFmrc.java:140) [classes/:5.4-SNAPSHOT]
at thredds.featurecollection.InvDatasetFeatureCollection.checkState(InvDatasetFeatureCollection.java:280) [classes/:5.4-SNAPSHOT]
at thredds.featurecollection.InvDatasetFcFmrc.makeCatalog(InvDatasetFcFmrc.java:175) [classes/:5.4-SNAPSHOT]
at thredds.core.CatalogManager.makeDynamicCatalog(CatalogManager.java:118) [classes/:5.4-SNAPSHOT]
at thredds.core.CatalogManager.getCatalog(CatalogManager.java:75) [classes/:5.4-SNAPSHOT]
at thredds.server.catalogservice.CatalogServiceController.handleRequest(CatalogServiceController.java:55) [classes/:5.4-SNAPSHOT]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:566) ~[?:?]
at org.springframework.web.method.support.InvocableHandlerMethod.doInvoke(InvocableHandlerMethod.java:205) [spring-web-5.3.19.jar:5.3.19]
at org.springframework.web.method.support.InvocableHandlerMethod.invokeForRequest(InvocableHandlerMethod.java:150) [spring-web-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.mvc.method.annotation.ServletInvocableHandlerMethod.invokeAndHandle(ServletInvocableHandlerMethod.java:117) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.invokeHandlerMethod(RequestMappingHandlerAdapter.java:895) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerAdapter.handleInternal(RequestMappingHandlerAdapter.java:808) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.mvc.method.AbstractHandlerMethodAdapter.handle(AbstractHandlerMethodAdapter.java:87) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:1067) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:963) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:1006) [spring-webmvc-5.3.19.jar:5.3.19]
at org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:898) [spring-webmvc-5.3.19.jar:5.3.19]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:655) [servlet-api.jar:?]
at org.springframework.web.servlet.FrameworkServlet.service(FrameworkServlet.java:883) [spring-webmvc-5.3.19.jar:5.3.19]
at javax.servlet.http.HttpServlet.service(HttpServlet.java:764) [servlet-api.jar:?]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:231) [catalina.jar:8.5.75]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [catalina.jar:8.5.75]
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:52) [tomcat-websocket.jar:8.5.75]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) [catalina.jar:8.5.75]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [catalina.jar:8.5.75]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:327) [spring-security-web-5.6.1.jar:5.6.1]
at thredds.servlet.filter.RequestBracketingLogMessageFilter.doFilter(RequestBracketingLogMessageFilter.java:50) [classes/:5.4-SNAPSHOT]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) [spring-security-web-5.6.1.jar:5.6.1]
at thredds.servlet.filter.RequestQueryFilter.doFilter(RequestQueryFilter.java:90) [classes/:5.4-SNAPSHOT]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) [spring-security-web-5.6.1.jar:5.6.1]
at thredds.servlet.filter.HttpHeadFilter.doFilter(HttpHeadFilter.java:47) [classes/:5.4-SNAPSHOT]
at org.springframework.security.web.FilterChainProxy$VirtualFilterChain.doFilter(FilterChainProxy.java:336) [spring-security-web-5.6.1.jar:5.6.1]
at org.springframework.security.web.FilterChainProxy.doFilterInternal(FilterChainProxy.java:211) [spring-security-web-5.6.1.jar:5.6.1]
at org.springframework.security.web.FilterChainProxy.doFilter(FilterChainProxy.java:183) [spring-security-web-5.6.1.jar:5.6.1]
at org.springframework.web.filter.DelegatingFilterProxy.invokeDelegate(DelegatingFilterProxy.java:354) [spring-web-5.3.19.jar:5.3.19]
at org.springframework.web.filter.DelegatingFilterProxy.doFilter(DelegatingFilterProxy.java:267) [spring-web-5.3.19.jar:5.3.19]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) [catalina.jar:8.5.75]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [catalina.jar:8.5.75]
at org.apache.logging.log4j.web.Log4jServletFilter.doFilter(Log4jServletFilter.java:71) [log4j-web-2.17.1.jar:2.17.1]
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:193) [catalina.jar:8.5.75]
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:166) [catalina.jar:8.5.75]
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:196) [catalina.jar:8.5.75]
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:97) [catalina.jar:8.5.75]
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:542) [catalina.jar:8.5.75]
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:135) [catalina.jar:8.5.75]
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:81) [catalina.jar:8.5.75]
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:698) [catalina.jar:8.5.75]
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:78) [catalina.jar:8.5.75]
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:366) [catalina.jar:8.5.75]
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:639) [tomcat-coyote.jar:8.5.75]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:65) [tomcat-coyote.jar:8.5.75]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:847) [tomcat-coyote.jar:8.5.75]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1680) [tomcat-coyote.jar:8.5.75]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) [tomcat-coyote.jar:8.5.75]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) [tomcat-util.jar:8.5.75]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) [tomcat-util.jar:8.5.75]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) [tomcat-util.jar:8.5.75]
at java.lang.Thread.run(Thread.java:829) [?:?]
Caused by: java.lang.IllegalStateException: ChronicleMap{name=GridDatasetInv, file=/opt/tds/servers/thredds_test/content/thredds/cache/collection/GridDatasetInv.dat, identityHashCode=667994454}: Attempt to allocate #3 extra segment tier, 2 is maximum.
Possible reasons include:
- you have forgotten to configure (or configured wrong) builder.entries() number
- same regarding other sizing Chronicle Hash configurations, most likely maxBloatFactor(), averageKeySize(), or averageValueSize()
- keys, inserted into the ChronicleHash, are distributed suspiciously bad. This might be a DOS attack
at net.openhft.chronicle.hash.impl.VanillaChronicleHash.allocateTier(VanillaChronicleHash.java:822) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.impl.CompiledMapQueryContext.nextTier(CompiledMapQueryContext.java:3131) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.impl.CompiledMapQueryContext.alloc(CompiledMapQueryContext.java:3492) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.impl.CompiledMapQueryContext.initEntryAndKey(CompiledMapQueryContext.java:3510) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.impl.CompiledMapQueryContext.putEntry(CompiledMapQueryContext.java:4003) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.impl.CompiledMapQueryContext.doInsert(CompiledMapQueryContext.java:4192) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.MapEntryOperations.insert(MapEntryOperations.java:153) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.impl.CompiledMapQueryContext.insert(CompiledMapQueryContext.java:4115) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.MapMethods.put(MapMethods.java:88) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at net.openhft.chronicle.map.VanillaChronicleMap.put(VanillaChronicleMap.java:856) ~[chronicle-map-3.21ea6.jar:3.21ea6]
at thredds.featurecollection.cache.GridInventoryCacheChronicle.put(GridInventoryCacheChronicle.java:89) ~[classes/:5.4-SNAPSHOT]
at ucar.nc2.ft.fmrc.GridDatasetInv$GenerateInv.call(GridDatasetInv.java:112) ~[cdm-core-5.5.3-SNAPSHOT.jar:5.5.3-SNAPSHOT]
at ucar.nc2.ft.fmrc.GridDatasetInv$GenerateInv.call(GridDatasetInv.java:68) ~[cdm-core-5.5.3-SNAPSHOT.jar:5.5.3-SNAPSHOT]
at com.google.common.cache.LocalCache$LocalManualCache$1.load(LocalCache.java:4853) ~[guava-30.1-jre.jar:?]
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3529) ~[guava-30.1-jre.jar:?]
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2278) ~[guava-30.1-jre.jar:?]
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2155) ~[guava-30.1-jre.jar:?]
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2045) ~[guava-30.1-jre.jar:?]
... 66 more
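For what it's worth, the exception suggests the persisted inventory map simply ran out of room: a ChronicleMap is pre-sized from builder.entries() and can only grow to roughly maxBloatFactor() times that before it refuses to allocate another segment tier. A back-of-the-envelope sketch of the arithmetic (the entries and bloat-factor values below are hypothetical illustrations, not the values TDS actually configures):

```java
public class CacheCapacity {
    // Rough usable capacity: configured entries times the allowed bloat factor.
    static long capacity(long entries, double maxBloatFactor) {
        return (long) (entries * maxBloatFactor);
    }

    public static void main(String[] args) {
        long entries = 1000;   // hypothetical builder.entries() value
        double bloat = 1.0;    // hypothetical maxBloatFactor() value
        System.out.println(capacity(entries, bloat));         // prints 1000
        System.out.println(9920 <= capacity(entries, bloat)); // prints false: ~10k files won't fit
    }
}
```

If that reading is right, no amount of catalog tweaking would help; the map's configured size would have to grow with the collection.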
I tried symlinking the chronicle-core jar that is included with THREDDS (ln -s chronicle-core-2.21ea28.jar chronicle-core-2.21ea25.jar) and downloading the other two missing jars into webapps/thredds##5.4-SNAPSHOT/WEB-INF/lib. After restarting the Tomcat instance, catalina.out no longer issued the warnings, but it made no difference for my FMRC datasets; the same errors appear in fmrc.log.
I then stripped out all other datasets and reduced my entire catalog to a single FMRC with 9920 files. This failed in exactly the same way. However, when I started with a smaller number of files and then added them month by month, sending an update trigger between each addition, the FMRC dataset was created successfully. I was able to build the dataset by adding about one year (~2190 files) at a time with a trigger between each add.
However, when I tried to add two years (~4380 files) and then send a trigger, the error above reappeared. Once the error occurs, the trigger has no further effect: THREDDS does not try to rebuild the dataset, so I can continue accessing it without the two years of files I tried to add. After I restart the Tomcat instance, however, accessing the dataset throws the same errors in fmrc.log and also produces errors in the web interface.
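One detail worth noting when reproducing this: the inventory cache is persisted on disk (the GridDatasetInv.dat file named in the exception), so a Tomcat restart alone does not reset it. A minimal sketch of clearing it with Tomcat stopped, assuming the cache path shown in the log above (adjust for your own content root):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ClearFmrcCache {
    public static void main(String[] args) throws IOException {
        // Path taken from the exception message above; hypothetical for other installs.
        Path cache = Path.of(
            "/opt/tds/servers/thredds_test/content/thredds/cache/collection/GridDatasetInv.dat");
        // deleteIfExists is a no-op when the file is already gone,
        // so this is safe to run repeatedly (with Tomcat stopped).
        boolean removed = Files.deleteIfExists(cache);
        System.out.println(removed ? "cache cleared" : "no cache file found");
    }
}
```

THREDDS should then rebuild the inventory cache on the next start, though if the sizing limit above is the root cause the rebuild will presumably hit the same wall once the collection is large enough.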
For this dataset I am activating TwoD, Runs, ConstantOffsets, Files, and Best. When I click the OpenDAP link for the Best or Forecast Model Run Collection (2D time coordinates) dataset I get the following error:
Error {
code = 500;
message = "null";
};
Clicking Forecast Model Run results in:
FileNotFound: test01/runs/catalog.xml
Clicking Constant Forecast Offset results in:
FileNotFound: test01/offset/catalog.xml
The Files dataset is the only one that works.
The catalog is as follows:
<?xml version="1.0" encoding="UTF-8"?>
<catalog name="DMCS Catalog"
         xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"
         xmlns:xlink="http://www.w3.org/1999/xlink">
  <service name="all" serviceType="Compound" base="">
    <service name="thisDODS" serviceType="OpenDAP" base="/thredds/dodsC/" />
    <service name="http" serviceType="HTTPServer" base="/thredds/fileServer/" />
    <service name="wms" serviceType="WMS" base="/thredds/wms/" />
    <service name="ncss" serviceType="NetcdfSubset" base="/thredds/ncss/grid/" />
  </service>
  <dataset name="ROMS doppio Real-Time Operational PSAS Forecast System Version 1 2017-present">
    <featureCollection name="Averages"
                       featureType="FMRC"
                       harvest="true"
                       path="test01">
      <metadata inherited="true">
        <serviceName>all</serviceName>
        <dataFormat>netCDF</dataFormat>
        <documentation type="summary">
          FMRC datasets for ROMS doppio real-time operational PSAS forecast system version 1 averages
        </documentation>
      </metadata>
      <collection spec="/path/to/data/tst/doppio_avg_#yyyyMMdd_HHmm#.*\.nc$"
                  name="doppio_2017_da_avg" />
      <update startup="test" trigger="allow" />
      <protoDataset choice="Latest">
        <netcdf xmlns="http://www.unidata.ucar.edu/namespaces/netcdf/ncml-2.2" location="nc/cldc.mean.nc" />
      </protoDataset>
      <fmrcConfig regularize="false" datasetTypes="TwoD Runs ConstantOffsets Files">
        <dataset name="Best" offsetsGreaterEqual="25" />
      </fmrcConfig>
    </featureCollection>
  </dataset>
</catalog>