
Comments (16)

zhengshou avatar zhengshou commented on September 3, 2024

Hi,

  1. experiments\THUMOS14\network_proposal\list_train_uniform16_proposal.lst:
    format: video_frame_directory start_frame_index class_label stepsize
    start_frame_index: is this the action starting time?

A: it is the index of the action start frame.

stepsize: 8/16/32? (Is the stepsize random?)

A: 16 up to 512; the multiscale windows are fixed.
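As a sketch of the fixed multiscale scheme described above (window lengths doubling from 16 up to 512 frames; the function name is illustrative, not from the repo):

```python
def multiscale_windows(min_len=16, max_len=512):
    """Enumerate the fixed multiscale window lengths:
    min_len, 2*min_len, ..., up to max_len (not random)."""
    windows = []
    length = min_len
    while length <= max_len:
        windows.append(length)
        length *= 2
    return windows

# multiscale_windows() -> [16, 32, 64, 128, 256, 512]
```

Each start frame in the .lst file is then paired with every window length, which is why the same start_frame_index repeats with different stepsize values.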

  2. experiments\THUMOS14\network_localization\list_train_uniform16_localization.lst:
    overlap: how is overlap computed?

A: we follow conventional detection methods and compute IoU as the measurement for overlap.
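For reference, temporal IoU between a candidate window and a ground-truth segment can be computed as below (a minimal sketch; representing segments as (start, end) frame-index pairs is my assumption, not the repo's API):

```python
def temporal_iou(seg_a, seg_b):
    """IoU between two temporal segments given as (start, end) frame indices."""
    inter = max(0, min(seg_a[1], seg_b[1]) - max(seg_a[0], seg_b[0]))
    union = (seg_a[1] - seg_a[0]) + (seg_b[1] - seg_b[0]) - inter
    return inter / union if union > 0 else 0.0
```

A window fully matching the ground truth gives IoU 1.0, disjoint segments give 0.0, and the localization list stores this value in its overlap column.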

Thanks. Merry Xmas.

from scnn.

Kerry0123 avatar Kerry0123 commented on September 3, 2024

Thanks. Merry Xmas.


Kerry0123 avatar Kerry0123 commented on September 3, 2024

Training my model; extract.log shows:
WARNING: Logging before InitGoogleLogging() is written to STDERR
E1226 10:14:15.090147 19922 extract_image_features.cpp:55] Using GPU #0
I1226 10:14:15.091804 19922 net.cpp:65] Memory required for Data0
I1226 10:14:15.091856 19922 net.cpp:76] Creating Layer data
I1226 10:14:15.091868 19922 net.cpp:112] data -> data
I1226 10:14:15.091883 19922 net.cpp:112] data -> label
I1226 10:14:15.091897 19922 video_data_layer.cpp:283] Opening file list_test_uniform16_proposal.lst
I1226 10:14:15.092732 19922 video_data_layer.cpp:318] A total of 1282 video chunks.
I1226 10:14:15.092742 19922 video_data_layer.cpp:345] read video from dataset/test/diving_01/
I1226 10:14:15.485334 19922 video_data_layer.cpp:366] output data size: 20,3,16,112,112
I1226 10:14:15.485400 19922 video_data_layer.cpp:388] Loading mean file from../../../models/train01_16_128_171_mean.binaryproto
I1226 10:14:15.672116 19922 video_data_layer.cpp:417] Initializing prefetch
I1226 10:14:15.672245 19922 video_data_layer.cpp:419] Prefetch initialized.
I1226 10:14:15.672344 19922 net.cpp:127] Top shape: 20 3 16 112 112 (12042240)
I1226 10:14:15.672354 19922 net.cpp:127] Top shape: 20 1 1 1 1 (20)
I1226 10:14:15.672360 19922 net.cpp:136] Memory required for Data 48169040
I1226 10:14:15.672372 19922 net.cpp:159] data does not need backward computation.
I1226 10:14:15.672407 19922 net.cpp:76] Creating Layer conv1a
I1226 10:14:15.672420 19922 net.cpp:86] conv1a <- data
I1226 10:14:15.672444 19922 net.cpp:112] conv1a -> conv1a
I1226 10:14:15.673990 19922 net.cpp:127] Top shape: 20 64 16 112 112 (256901120)
I1226 10:14:15.674005 19922 net.cpp:136] Memory required for Data 1075773520
I1226 10:14:15.674011 19922 net.cpp:154] conv1a needs backward computation.
I1226 10:14:15.674024 19922 net.cpp:76] Creating Layer relu1a
I1226 10:14:15.674031 19922 net.cpp:86] relu1a <- conv1a
I1226 10:14:15.674047 19922 net.cpp:100] relu1a -> conv1a (in-place)
I1226 10:14:15.674057 19922 net.cpp:127] Top shape: 20 64 16 112 112 (256901120)
I1226 10:14:15.674062 19922 net.cpp:136] Memory required for Data 1075773520
I1226 10:14:15.674067 19922 net.cpp:154] relu1a needs backward computation.
I1226 10:14:15.674077 19922 net.cpp:76] Creating Layer pool1
I1226 10:14:15.674083 19922 net.cpp:86] pool1 <- conv1a
I1226 10:14:15.674091 19922 net.cpp:112] pool1 -> pool1
I1226 10:14:15.674105 19922 net.cpp:127] Top shape: 20 64 16 56 56 (64225280)
I1226 10:14:15.674111 19922 net.cpp:136] Memory required for Data 1332674640
I1226 10:14:15.674116 19922 net.cpp:154] pool1 needs backward computation.
I1226 10:14:15.674129 19922 net.cpp:76] Creating Layer conv2a
I1226 10:14:15.674136 19922 net.cpp:86] conv2a <- pool1
I1226 10:14:15.674144 19922 net.cpp:112] conv2a -> conv2a
I1226 10:14:15.697609 19922 net.cpp:127] Top shape: 20 128 16 56 56 (128450560)
I1226 10:14:15.697630 19922 net.cpp:136] Memory required for Data 1846476880
I1226 10:14:15.697636 19922 net.cpp:154] conv2a needs backward computation.
I1226 10:14:15.697644 19922 net.cpp:76] Creating Layer relu2a
I1226 10:14:15.697652 19922 net.cpp:86] relu2a <- conv2a
I1226 10:14:15.697661 19922 net.cpp:100] relu2a -> conv2a (in-place)
I1226 10:14:15.697671 19922 net.cpp:127] Top shape: 20 128 16 56 56 (128450560)
I1226 10:14:15.697679 19922 net.cpp:136] Memory required for Data 1846476880
I1226 10:14:15.697684 19922 net.cpp:154] relu2a needs backward computation.
I1226 10:14:15.697693 19922 net.cpp:76] Creating Layer pool2
I1226 10:14:15.697700 19922 net.cpp:86] pool2 <- conv2a
I1226 10:14:15.697707 19922 net.cpp:112] pool2 -> pool2
I1226 10:14:15.697717 19922 net.cpp:127] Top shape: 20 128 8 28 28 (16056320)
I1226 10:14:15.697723 19922 net.cpp:136] Memory required for Data 1910702160
I1226 10:14:15.697728 19922 net.cpp:154] pool2 needs backward computation.
I1226 10:14:15.697738 19922 net.cpp:76] Creating Layer conv3a
I1226 10:14:15.697744 19922 net.cpp:86] conv3a <- pool2
I1226 10:14:15.697751 19922 net.cpp:112] conv3a -> conv3a
I1226 10:14:15.798044 19922 net.cpp:127] Top shape: 20 256 8 28 28 (32112640)
I1226 10:14:15.798070 19922 net.cpp:136] Memory required for Data 2039152720
I1226 10:14:15.798089 19922 net.cpp:154] conv3a needs backward computation.
I1226 10:14:15.798102 19922 net.cpp:76] Creating Layer relu3a
I1226 10:14:15.798110 19922 net.cpp:86] relu3a <- conv3a
I1226 10:14:15.798120 19922 net.cpp:100] relu3a -> conv3a (in-place)
I1226 10:14:15.798127 19922 net.cpp:127] Top shape: 20 256 8 28 28 (32112640)
I1226 10:14:15.798135 19922 net.cpp:136] Memory required for Data 2039152720
I1226 10:14:15.798140 19922 net.cpp:154] relu3a needs backward computation.
I1226 10:14:15.798149 19922 net.cpp:76] Creating Layer conv3b
I1226 10:14:15.798156 19922 net.cpp:86] conv3b <- conv3a
I1226 10:14:15.798163 19922 net.cpp:112] conv3b -> conv3b
I1226 10:14:16.002578 19922 net.cpp:127] Top shape: 20 256 8 28 28 (32112640)
I1226 10:14:16.002604 19922 net.cpp:136] Memory required for Data 2167603280
I1226 10:14:16.002610 19922 net.cpp:154] conv3b needs backward computation.
I1226 10:14:16.002620 19922 net.cpp:76] Creating Layer relu3b
I1226 10:14:16.002627 19922 net.cpp:86] relu3b <- conv3b
I1226 10:14:16.002636 19922 net.cpp:100] relu3b -> conv3b (in-place)
I1226 10:14:16.002645 19922 net.cpp:127] Top shape: 20 256 8 28 28 (32112640)
I1226 10:14:16.002650 19922 net.cpp:136] Memory required for Data 2167603280
I1226 10:14:16.002655 19922 net.cpp:154] relu3b needs backward computation.
I1226 10:14:16.002665 19922 net.cpp:76] Creating Layer pool3
I1226 10:14:16.002671 19922 net.cpp:86] pool3 <- conv3b
I1226 10:14:16.002678 19922 net.cpp:112] pool3 -> pool3
I1226 10:14:16.002694 19922 net.cpp:127] Top shape: 20 256 4 14 14 (4014080)
I1226 10:14:16.002701 19922 net.cpp:136] Memory required for Data 2183659600
I1226 10:14:16.002706 19922 net.cpp:154] pool3 needs backward computation.
I1226 10:14:16.002717 19922 net.cpp:76] Creating Layer conv4a
I1226 10:14:16.002722 19922 net.cpp:86] conv4a <- pool3
I1226 10:14:16.002730 19922 net.cpp:112] conv4a -> conv4a
I1226 10:14:16.398593 19922 net.cpp:127] Top shape: 20 512 4 14 14 (8028160)
I1226 10:14:16.398636 19922 net.cpp:136] Memory required for Data 2215772240
I1226 10:14:16.398644 19922 net.cpp:154] conv4a needs backward computation.
I1226 10:14:16.398684 19922 net.cpp:76] Creating Layer relu4a
I1226 10:14:16.398705 19922 net.cpp:86] relu4a <- conv4a
I1226 10:14:16.398720 19922 net.cpp:100] relu4a -> conv4a (in-place)
I1226 10:14:16.398726 19922 net.cpp:127] Top shape: 20 512 4 14 14 (8028160)
I1226 10:14:16.398731 19922 net.cpp:136] Memory required for Data 2215772240
I1226 10:14:16.398736 19922 net.cpp:154] relu4a needs backward computation.
I1226 10:14:16.398749 19922 net.cpp:76] Creating Layer conv4b
I1226 10:14:16.398756 19922 net.cpp:86] conv4b <- conv4a
I1226 10:14:16.398762 19922 net.cpp:112] conv4b -> conv4b
I1226 10:14:17.064122 19922 net.cpp:127] Top shape: 20 512 4 14 14 (8028160)
I1226 10:14:17.064172 19922 net.cpp:136] Memory required for Data 2247884880
I1226 10:14:17.064179 19922 net.cpp:154] conv4b needs backward computation.
I1226 10:14:17.064208 19922 net.cpp:76] Creating Layer relu4b
I1226 10:14:17.064218 19922 net.cpp:86] relu4b <- conv4b
I1226 10:14:17.064234 19922 net.cpp:100] relu4b -> conv4b (in-place)
I1226 10:14:17.064241 19922 net.cpp:127] Top shape: 20 512 4 14 14 (8028160)
I1226 10:14:17.064246 19922 net.cpp:136] Memory required for Data 2247884880
I1226 10:14:17.064251 19922 net.cpp:154] relu4b needs backward computation.
I1226 10:14:17.064277 19922 net.cpp:76] Creating Layer pool4
I1226 10:14:17.064285 19922 net.cpp:86] pool4 <- conv4b
I1226 10:14:17.064291 19922 net.cpp:112] pool4 -> pool4
I1226 10:14:17.064301 19922 net.cpp:127] Top shape: 20 512 2 7 7 (1003520)
I1226 10:14:17.064306 19922 net.cpp:136] Memory required for Data 2251898960
I1226 10:14:17.064311 19922 net.cpp:154] pool4 needs backward computation.
I1226 10:14:17.064321 19922 net.cpp:76] Creating Layer conv5a
I1226 10:14:17.064326 19922 net.cpp:86] conv5a <- pool4
I1226 10:14:17.064332 19922 net.cpp:112] conv5a -> conv5a
I1226 10:14:17.715493 19922 net.cpp:127] Top shape: 20 512 2 7 7 (1003520)
I1226 10:14:17.715556 19922 net.cpp:136] Memory required for Data 2255913040
I1226 10:14:17.715575 19922 net.cpp:154] conv5a needs backward computation.
I1226 10:14:17.715598 19922 net.cpp:76] Creating Layer relu5a
I1226 10:14:17.715613 19922 net.cpp:86] relu5a <- conv5a
I1226 10:14:17.715628 19922 net.cpp:100] relu5a -> conv5a (in-place)
I1226 10:14:17.715636 19922 net.cpp:127] Top shape: 20 512 2 7 7 (1003520)
I1226 10:14:17.715641 19922 net.cpp:136] Memory required for Data 2255913040
I1226 10:14:17.715646 19922 net.cpp:154] relu5a needs backward computation.
I1226 10:14:17.715656 19922 net.cpp:76] Creating Layer conv5b
I1226 10:14:17.715662 19922 net.cpp:86] conv5b <- conv5a
I1226 10:14:17.715667 19922 net.cpp:112] conv5b -> conv5b
I1226 10:14:18.354892 19922 net.cpp:127] Top shape: 20 512 2 7 7 (1003520)
I1226 10:14:18.354936 19922 net.cpp:136] Memory required for Data 2259927120
I1226 10:14:18.354944 19922 net.cpp:154] conv5b needs backward computation.
I1226 10:14:18.354975 19922 net.cpp:76] Creating Layer relu5b
I1226 10:14:18.354986 19922 net.cpp:86] relu5b <- conv5b
I1226 10:14:18.355001 19922 net.cpp:100] relu5b -> conv5b (in-place)
I1226 10:14:18.355010 19922 net.cpp:127] Top shape: 20 512 2 7 7 (1003520)
I1226 10:14:18.355013 19922 net.cpp:136] Memory required for Data 2259927120
I1226 10:14:18.355017 19922 net.cpp:154] relu5b needs backward computation.
I1226 10:14:18.355031 19922 net.cpp:76] Creating Layer pool5
I1226 10:14:18.355036 19922 net.cpp:86] pool5 <- conv5b
I1226 10:14:18.355044 19922 net.cpp:112] pool5 -> pool5
I1226 10:14:18.355054 19922 net.cpp:127] Top shape: 20 512 1 4 4 (163840)
I1226 10:14:18.355059 19922 net.cpp:136] Memory required for Data 2260582480
I1226 10:14:18.355063 19922 net.cpp:154] pool5 needs backward computation.
I1226 10:14:18.355087 19922 net.cpp:76] Creating Layer fc6-1
I1226 10:14:18.355093 19922 net.cpp:86] fc6-1 <- pool5
I1226 10:14:18.355100 19922 net.cpp:112] fc6-1 -> fc6-1
I1226 10:14:21.460341 19922 net.cpp:127] Top shape: 20 4096 1 1 1 (81920)
I1226 10:14:21.460398 19922 net.cpp:136] Memory required for Data 2260910160
I1226 10:14:21.460430 19922 net.cpp:154] fc6-1 needs backward computation.
I1226 10:14:21.460458 19922 net.cpp:76] Creating Layer relu6
I1226 10:14:21.460477 19922 net.cpp:86] relu6 <- fc6-1
I1226 10:14:21.460494 19922 net.cpp:100] relu6 -> fc6-1 (in-place)
I1226 10:14:21.460501 19922 net.cpp:127] Top shape: 20 4096 1 1 1 (81920)
I1226 10:14:21.460505 19922 net.cpp:136] Memory required for Data 2260910160
I1226 10:14:21.460510 19922 net.cpp:154] relu6 needs backward computation.
I1226 10:14:21.460520 19922 net.cpp:76] Creating Layer drop6
I1226 10:14:21.460525 19922 net.cpp:86] drop6 <- fc6-1
I1226 10:14:21.460530 19922 net.cpp:100] drop6 -> fc6-1 (in-place)
I1226 10:14:21.460536 19922 net.cpp:127] Top shape: 20 4096 1 1 1 (81920)
I1226 10:14:21.460539 19922 net.cpp:136] Memory required for Data 2260910160
I1226 10:14:21.460543 19922 net.cpp:154] drop6 needs backward computation.
I1226 10:14:21.460558 19922 net.cpp:76] Creating Layer fc7-1
I1226 10:14:21.460564 19922 net.cpp:86] fc7-1 <- fc6-1
I1226 10:14:21.460571 19922 net.cpp:112] fc7-1 -> fc7-1
I1226 10:14:22.999430 19922 net.cpp:127] Top shape: 20 4096 1 1 1 (81920)
I1226 10:14:22.999480 19922 net.cpp:136] Memory required for Data 2261237840
I1226 10:14:22.999487 19922 net.cpp:154] fc7-1 needs backward computation.
I1226 10:14:22.999505 19922 net.cpp:76] Creating Layer relu7
I1226 10:14:22.999513 19922 net.cpp:86] relu7 <- fc7-1
I1226 10:14:22.999528 19922 net.cpp:100] relu7 -> fc7-1 (in-place)
I1226 10:14:22.999536 19922 net.cpp:127] Top shape: 20 4096 1 1 1 (81920)
I1226 10:14:22.999541 19922 net.cpp:136] Memory required for Data 2261237840
I1226 10:14:22.999544 19922 net.cpp:154] relu7 needs backward computation.
I1226 10:14:22.999552 19922 net.cpp:76] Creating Layer drop7
I1226 10:14:22.999557 19922 net.cpp:86] drop7 <- fc7-1
I1226 10:14:22.999563 19922 net.cpp:100] drop7 -> fc7-1 (in-place)
I1226 10:14:22.999572 19922 net.cpp:127] Top shape: 20 4096 1 1 1 (81920)
I1226 10:14:22.999575 19922 net.cpp:136] Memory required for Data 2261237840
I1226 10:14:22.999590 19922 net.cpp:154] drop7 needs backward computation.
I1226 10:14:22.999599 19922 net.cpp:76] Creating Layer fc8
I1226 10:14:22.999604 19922 net.cpp:86] fc8 <- fc7-1
I1226 10:14:22.999610 19922 net.cpp:112] fc8 -> fc8
I1226 10:14:23.000346 19922 net.cpp:127] Top shape: 20 2 1 1 1 (40)
I1226 10:14:23.000354 19922 net.cpp:136] Memory required for Data 2261238000
I1226 10:14:23.000357 19922 net.cpp:154] fc8 needs backward computation.
I1226 10:14:23.000363 19922 net.cpp:76] Creating Layer prob
I1226 10:14:23.000368 19922 net.cpp:86] prob <- fc8
I1226 10:14:23.000375 19922 net.cpp:112] prob -> prob
I1226 10:14:23.000383 19922 net.cpp:127] Top shape: 20 2 1 1 1 (40)
I1226 10:14:23.000388 19922 net.cpp:136] Memory required for Data 2261238160
I1226 10:14:23.000392 19922 net.cpp:154] prob needs backward computation.
I1226 10:14:23.000396 19922 net.cpp:165] This network produces output label
I1226 10:14:23.000401 19922 net.cpp:165] This network produces output prob
I1226 10:14:23.000434 19922 net.cpp:183] Collecting Learning Rate and Weight Decay.
I1226 10:14:23.000463 19922 net.cpp:176] Network initialization done.
I1226 10:14:23.000468 19922 net.cpp:177] Memory required for Data 2261238160
I1226 10:14:25.754410 19922 net.cpp:323] Copying source layer data
I1226 10:14:25.754462 19922 net.cpp:323] Copying source layer conv1a
I1226 10:14:25.754602 19922 net.cpp:323] Copying source layer relu1a
I1226 10:14:25.754611 19922 net.cpp:323] Copying source layer pool1
I1226 10:14:25.754616 19922 net.cpp:323] Copying source layer conv2a
I1226 10:14:25.757021 19922 net.cpp:323] Copying source layer relu2a
I1226 10:14:25.757030 19922 net.cpp:323] Copying source layer pool2
I1226 10:14:25.757035 19922 net.cpp:323] Copying source layer conv3a
I1226 10:14:25.766515 19922 net.cpp:323] Copying source layer relu3a
I1226 10:14:25.766531 19922 net.cpp:323] Copying source layer conv3b
I1226 10:14:25.785586 19922 net.cpp:323] Copying source layer relu3b
I1226 10:14:25.785604 19922 net.cpp:323] Copying source layer pool3
I1226 10:14:25.785609 19922 net.cpp:323] Copying source layer conv4a
I1226 10:14:25.820544 19922 net.cpp:323] Copying source layer relu4a
I1226 10:14:25.820579 19922 net.cpp:323] Copying source layer conv4b
I1226 10:14:25.914378 19922 net.cpp:323] Copying source layer relu4b
I1226 10:14:25.914424 19922 net.cpp:323] Copying source layer pool4
I1226 10:14:25.914443 19922 net.cpp:323] Copying source layer conv5a
I1226 10:14:26.044428 19922 net.cpp:323] Copying source layer relu5a
I1226 10:14:26.044479 19922 net.cpp:323] Copying source layer conv5b
I1226 10:14:26.166146 19922 net.cpp:323] Copying source layer relu5b
I1226 10:14:26.166204 19922 net.cpp:323] Copying source layer pool5
I1226 10:14:26.166210 19922 net.cpp:323] Copying source layer fc6-1
I1226 10:14:26.772497 19922 net.cpp:323] Copying source layer relu6
I1226 10:14:26.772542 19922 net.cpp:323] Copying source layer drop6
I1226 10:14:26.772548 19922 net.cpp:323] Copying source layer fc7-1
I1226 10:14:27.074296 19922 net.cpp:323] Copying source layer relu7
I1226 10:14:27.074359 19922 net.cpp:323] Copying source layer drop7
I1226 10:14:27.074367 19922 net.cpp:323] Copying source layer fc8
I1226 10:14:27.074477 19922 net.cpp:320] Ignoring source layer loss
E1226 10:14:27.082305 19922 extract_image_features.cpp:72] Extracting features for 28192 batches
E1226 10:14:28.246733 19922 extract_image_features.cpp:112] Successfully extracted 0 features!
Question: why were 0 features extracted?
Thanks.


zhengshou avatar zhengshou commented on September 3, 2024

First of all, you seem to be extracting features instead of training a model.

As for feature extraction, many mistakes can lead to "extracted 0 features", such as using an incorrect prefix file.
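A quick sanity check over the input list can catch several such mistakes before running extraction. The helper below is a sketch (check_lst_line is a hypothetical name; it only validates the field count, the integer columns, and that the frame directory exists):

```python
import os

def check_lst_line(line):
    """Report problems in one list line of the form:
    video_frame_directory start_frame label window_length"""
    problems = []
    fields = line.split()
    if len(fields) != 4:
        return ["expected 4 fields, got %d" % len(fields)]
    video_dir, start, label, length = fields
    # The three numeric columns must be plain integers.
    for name, value in (("start_frame", start), ("label", label),
                        ("window_length", length)):
        if not value.isdigit():
            problems.append("%s is not an integer: %s" % (name, value))
    # The frame directory must exist on disk.
    if not os.path.isdir(video_dir):
        problems.append("missing directory: " + video_dir)
    return problems
```

Running this over every line of the .lst file (and separately confirming the output prefix file points to a writable location) narrows down why zero features come out.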


Kerry0123 avatar Kerry0123 commented on September 3, 2024

How do I train my model? Thanks.


zhengshou avatar zhengshou commented on September 3, 2024

You would be able to identify the above issue if you have experience with C3D and Caffe. As stated in the README, please first refer to C3D and Caffe for general instructions on how to train a 3D CNN model. After that, please read our paper carefully for the training details customized for SCNN. Finally, read our README file for implementation instructions.


Kerry0123 avatar Kerry0123 commented on September 3, 2024

Thanks.


Kerry0123 avatar Kerry0123 commented on September 3, 2024

How do I train my model?
cd scnn/experiments/THUMOS14/network_proposal:
1. prepare my data (list_train_uniform16_proposal.lst)
for example:
dataset/valvideo_validation/diving_01/ 678 1 8
dataset/valvideo_validation/diving_01/ 678 1 16
dataset/valvideo_validation/diving_01/ 678 1 32
dataset/valvideo_validation/diving_01/ 678 1 64
dataset/valvideo_validation/diving_01/ 678 1 128
dataset/valvideo_validation/diving_01/ 678 1 256
dataset/valvideo_validation/diving_01/ 678 1 512
dataset/valvideo_validation/diving_01/ 1111 1 8
dataset/valvideo_validation/diving_01/ 1111 1 16
dataset/valvideo_validation/diving_01/ 1111 1 32
dataset/valvideo_validation/diving_01/ 1111 1 64
dataset/valvideo_validation/diving_01/ 1111 1 128
dataset/valvideo_validation/diving_01/ 1111 1 256
dataset/valvideo_validation/diving_01/ 1111 1 512
dataset/valvideo_validation/diving_01/ 1364 1 8
dataset/valvideo_validation/diving_01/ 1364 1 16
dataset/valvideo_validation/diving_01/ 1364 1 32
dataset/valvideo_validation/diving_01/ 1364 1 64
dataset/valvideo_validation/diving_01/ 1364 1 128
dataset/valvideo_validation/diving_01/ 1364 1 256
dataset/valvideo_validation/diving_01/ 1364 1 512
dataset/valvideo_validation/diving_01/ 1522 1 8
dataset/valvideo_validation/diving_01/ 1522 1 16
dataset/valvideo_validation/diving_01/ 1522 1 32
dataset/valvideo_validation/diving_01/ 1522 1 64
dataset/valvideo_validation/diving_01/ 1522 1 128
dataset/valvideo_validation/diving_01/ 1522 1 256
dataset/valvideo_validation/diving_01/ 1522 1 512

678, 1111, 1364, and 1522 are the start-frame indices of the diving actions.

2. run sh finetuning.sh

Is my idea correct?
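The .lst lines shown in step 1 can be generated with a short script (proposal_lines is a hypothetical helper; the path, label, and scales simply mirror the example above):

```python
def proposal_lines(video_dir, start_frames, label,
                   scales=(8, 16, 32, 64, 128, 256, 512)):
    """Emit one list line per (start_frame, scale) pair in the format:
    video_frame_directory start_frame label window_length"""
    lines = []
    for start in start_frames:
        for scale in scales:
            lines.append(f"{video_dir} {start} {label} {scale}")
    return lines

# proposal_lines("dataset/valvideo_validation/diving_01/",
#                [678, 1111, 1364, 1522], 1)
# reproduces the 28 lines listed above.
```

Writing the concatenated output for all videos to list_train_uniform16_proposal.lst before running finetuning.sh keeps the file format consistent with the example.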

train.log:
I1228 16:09:17.603974 9386 finetune_net.cpp:25] Starting Optimization
I1228 16:09:17.604141 9386 solver.cpp:41] Creating training net.
I1228 16:09:17.605696 9386 net.cpp:65] Memory required for Data0
I1228 16:09:17.605808 9386 net.cpp:76] Creating Layer data
I1228 16:09:17.605831 9386 net.cpp:112] data -> data
I1228 16:09:17.605859 9386 net.cpp:112] data -> label
I1228 16:09:17.605882 9386 video_data_layer.cpp:283] Opening file list_train_uniform16_proposal.lst
I1228 16:09:17.606021 9386 video_data_layer.cpp:313] Shuffling data
I1228 16:09:18.032943 9386 video_data_layer.cpp:318] A total of 28 video chunks.
I1228 16:09:18.033015 9386 video_data_layer.cpp:345] read video from dataset/valvideo_validation/diving_01/
I1228 16:09:18.363059 9386 video_data_layer.cpp:366] output data size: 3,3,16,112,112
I1228 16:09:18.363119 9386 video_data_layer.cpp:388] Loading mean file from../../../models/train01_16_128_171_mean.binaryproto
I1228 16:09:18.402557 9386 video_data_layer.cpp:417] Initializing prefetch
I1228 16:09:18.402652 9386 video_data_layer.cpp:419] Prefetch initialized.
I1228 16:09:18.402736 9386 net.cpp:127] Top shape: 3 3 16 112 112 (1806336)
I1228 16:09:18.402746 9386 net.cpp:127] Top shape: 3 1 1 1 1 (3)
I1228 16:09:18.402751 9386 net.cpp:136] Memory required for Data 7225356
I1228 16:09:18.402761 9386 net.cpp:159] data does not need backward computation.
I1228 16:09:18.402783 9386 net.cpp:76] Creating Layer conv1a
I1228 16:09:18.402792 9386 net.cpp:86] conv1a <- data
I1228 16:09:18.402812 9386 net.cpp:112] conv1a -> conv1a
I1228 16:09:18.404013 9386 net.cpp:127] Top shape: 3 64 16 112 112 (38535168)
I1228 16:09:18.404026 9386 net.cpp:136] Memory required for Data 161366028
I1228 16:09:18.404031 9386 net.cpp:154] conv1a needs backward computation.
I1228 16:09:18.404042 9386 net.cpp:76] Creating Layer relu1a
I1228 16:09:18.404047 9386 net.cpp:86] relu1a <- conv1a
I1228 16:09:18.404058 9386 net.cpp:100] relu1a -> conv1a (in-place)
I1228 16:09:18.404067 9386 net.cpp:127] Top shape: 3 64 16 112 112 (38535168)
I1228 16:09:18.404072 9386 net.cpp:136] Memory required for Data 161366028
I1228 16:09:18.404076 9386 net.cpp:154] relu1a needs backward computation.
I1228 16:09:18.404085 9386 net.cpp:76] Creating Layer pool1
I1228 16:09:18.404089 9386 net.cpp:86] pool1 <- conv1a
I1228 16:09:18.404095 9386 net.cpp:112] pool1 -> pool1
I1228 16:09:18.404120 9386 net.cpp:127] Top shape: 3 64 16 56 56 (9633792)
I1228 16:09:18.404126 9386 net.cpp:136] Memory required for Data 199901196
I1228 16:09:18.404130 9386 net.cpp:154] pool1 needs backward computation.
I1228 16:09:18.404141 9386 net.cpp:76] Creating Layer conv2a
I1228 16:09:18.404146 9386 net.cpp:86] conv2a <- pool1
I1228 16:09:18.404152 9386 net.cpp:112] conv2a -> conv2a
I1228 16:09:18.425338 9386 net.cpp:127] Top shape: 3 128 16 56 56 (19267584)
I1228 16:09:18.425359 9386 net.cpp:136] Memory required for Data 276971532
I1228 16:09:18.425379 9386 net.cpp:154] conv2a needs backward computation.
I1228 16:09:18.425400 9386 net.cpp:76] Creating Layer relu2a
I1228 16:09:18.425408 9386 net.cpp:86] relu2a <- conv2a
I1228 16:09:18.425417 9386 net.cpp:100] relu2a -> conv2a (in-place)
I1228 16:09:18.425424 9386 net.cpp:127] Top shape: 3 128 16 56 56 (19267584)
I1228 16:09:18.425429 9386 net.cpp:136] Memory required for Data 276971532
I1228 16:09:18.425434 9386 net.cpp:154] relu2a needs backward computation.
I1228 16:09:18.425442 9386 net.cpp:76] Creating Layer pool2
I1228 16:09:18.425449 9386 net.cpp:86] pool2 <- conv2a
I1228 16:09:18.425457 9386 net.cpp:112] pool2 -> pool2
I1228 16:09:18.425467 9386 net.cpp:127] Top shape: 3 128 8 28 28 (2408448)
I1228 16:09:18.425472 9386 net.cpp:136] Memory required for Data 286605324
I1228 16:09:18.425477 9386 net.cpp:154] pool2 needs backward computation.
I1228 16:09:18.425487 9386 net.cpp:76] Creating Layer conv3a
I1228 16:09:18.425492 9386 net.cpp:86] conv3a <- pool2
I1228 16:09:18.425498 9386 net.cpp:112] conv3a -> conv3a
I1228 16:09:18.502683 9386 net.cpp:127] Top shape: 3 256 8 28 28 (4816896)
I1228 16:09:18.502710 9386 net.cpp:136] Memory required for Data 305872908
I1228 16:09:18.502729 9386 net.cpp:154] conv3a needs backward computation.
I1228 16:09:18.502753 9386 net.cpp:76] Creating Layer relu3a
I1228 16:09:18.502759 9386 net.cpp:86] relu3a <- conv3a
I1228 16:09:18.502768 9386 net.cpp:100] relu3a -> conv3a (in-place)
I1228 16:09:18.502775 9386 net.cpp:127] Top shape: 3 256 8 28 28 (4816896)
I1228 16:09:18.502780 9386 net.cpp:136] Memory required for Data 305872908
I1228 16:09:18.502784 9386 net.cpp:154] relu3a needs backward computation.
I1228 16:09:18.502792 9386 net.cpp:76] Creating Layer conv3b
I1228 16:09:18.502799 9386 net.cpp:86] conv3b <- conv3a
I1228 16:09:18.502804 9386 net.cpp:112] conv3b -> conv3b
I1228 16:09:18.699749 9386 net.cpp:127] Top shape: 3 256 8 28 28 (4816896)
I1228 16:09:18.699820 9386 net.cpp:136] Memory required for Data 325140492
I1228 16:09:18.699831 9386 net.cpp:154] conv3b needs backward computation.
I1228 16:09:18.699863 9386 net.cpp:76] Creating Layer relu3b
I1228 16:09:18.699880 9386 net.cpp:86] relu3b <- conv3b
I1228 16:09:18.699897 9386 net.cpp:100] relu3b -> conv3b (in-place)
I1228 16:09:18.699905 9386 net.cpp:127] Top shape: 3 256 8 28 28 (4816896)
I1228 16:09:18.699911 9386 net.cpp:136] Memory required for Data 325140492
I1228 16:09:18.699916 9386 net.cpp:154] relu3b needs backward computation.
I1228 16:09:18.699930 9386 net.cpp:76] Creating Layer pool3
I1228 16:09:18.699937 9386 net.cpp:86] pool3 <- conv3b
I1228 16:09:18.699949 9386 net.cpp:112] pool3 -> pool3
I1228 16:09:18.699970 9386 net.cpp:127] Top shape: 3 256 4 14 14 (602112)
I1228 16:09:18.699977 9386 net.cpp:136] Memory required for Data 327548940
I1228 16:09:18.699981 9386 net.cpp:154] pool3 needs backward computation.
I1228 16:09:18.699992 9386 net.cpp:76] Creating Layer conv4a
I1228 16:09:18.699997 9386 net.cpp:86] conv4a <- pool3
I1228 16:09:18.700006 9386 net.cpp:112] conv4a -> conv4a
I1228 16:09:19.069234 9386 net.cpp:127] Top shape: 3 512 4 14 14 (1204224)
I1228 16:09:19.069264 9386 net.cpp:136] Memory required for Data 332365836
I1228 16:09:19.069278 9386 net.cpp:154] conv4a needs backward computation.
I1228 16:09:19.069298 9386 net.cpp:76] Creating Layer relu4a
I1228 16:09:19.069306 9386 net.cpp:86] relu4a <- conv4a
I1228 16:09:19.069319 9386 net.cpp:100] relu4a -> conv4a (in-place)
I1228 16:09:19.069329 9386 net.cpp:127] Top shape: 3 512 4 14 14 (1204224)
I1228 16:09:19.069334 9386 net.cpp:136] Memory required for Data 332365836
I1228 16:09:19.069339 9386 net.cpp:154] relu4a needs backward computation.
I1228 16:09:19.069349 9386 net.cpp:76] Creating Layer conv4b
I1228 16:09:19.069355 9386 net.cpp:86] conv4b <- conv4a
I1228 16:09:19.069361 9386 net.cpp:112] conv4b -> conv4b
I1228 16:09:19.723534 9386 net.cpp:127] Top shape: 3 512 4 14 14 (1204224)
I1228 16:09:19.723599 9386 net.cpp:136] Memory required for Data 337182732
I1228 16:09:19.723611 9386 net.cpp:154] conv4b needs backward computation.
I1228 16:09:19.723631 9386 net.cpp:76] Creating Layer relu4b
I1228 16:09:19.723645 9386 net.cpp:86] relu4b <- conv4b
I1228 16:09:19.723664 9386 net.cpp:100] relu4b -> conv4b (in-place)
I1228 16:09:19.723676 9386 net.cpp:127] Top shape: 3 512 4 14 14 (1204224)
I1228 16:09:19.723685 9386 net.cpp:136] Memory required for Data 337182732
I1228 16:09:19.723691 9386 net.cpp:154] relu4b needs backward computation.
I1228 16:09:19.723716 9386 net.cpp:76] Creating Layer pool4
I1228 16:09:19.723726 9386 net.cpp:86] pool4 <- conv4b
I1228 16:09:19.723737 9386 net.cpp:112] pool4 -> pool4
I1228 16:09:19.723752 9386 net.cpp:127] Top shape: 3 512 2 7 7 (150528)
I1228 16:09:19.723760 9386 net.cpp:136] Memory required for Data 337784844
I1228 16:09:19.723767 9386 net.cpp:154] pool4 needs backward computation.
I1228 16:09:19.723805 9386 net.cpp:76] Creating Layer conv5a
I1228 16:09:19.723816 9386 net.cpp:86] conv5a <- pool4
I1228 16:09:19.723827 9386 net.cpp:112] conv5a -> conv5a
I1228 16:09:20.352849 9386 net.cpp:127] Top shape: 3 512 2 7 7 (150528)
I1228 16:09:20.352903 9386 net.cpp:136] Memory required for Data 338386956
I1228 16:09:20.352919 9386 net.cpp:154] conv5a needs backward computation.
I1228 16:09:20.352933 9386 net.cpp:76] Creating Layer relu5a
I1228 16:09:20.352943 9386 net.cpp:86] relu5a <- conv5a
I1228 16:09:20.352957 9386 net.cpp:100] relu5a -> conv5a (in-place)
I1228 16:09:20.352964 9386 net.cpp:127] Top shape: 3 512 2 7 7 (150528)
I1228 16:09:20.352969 9386 net.cpp:136] Memory required for Data 338386956
I1228 16:09:20.352973 9386 net.cpp:154] relu5a needs backward computation.
I1228 16:09:20.352988 9386 net.cpp:76] Creating Layer conv5b
I1228 16:09:20.352994 9386 net.cpp:86] conv5b <- conv5a
I1228 16:09:20.353000 9386 net.cpp:112] conv5b -> conv5b
I1228 16:09:20.978358 9386 net.cpp:127] Top shape: 3 512 2 7 7 (150528)
I1228 16:09:20.978410 9386 net.cpp:136] Memory required for Data 338989068
I1228 16:09:20.978420 9386 net.cpp:154] conv5b needs backward computation.
I1228 16:09:20.978435 9386 net.cpp:76] Creating Layer relu5b
I1228 16:09:20.978456 9386 net.cpp:86] relu5b <- conv5b
I1228 16:09:20.978471 9386 net.cpp:100] relu5b -> conv5b (in-place)
I1228 16:09:20.978478 9386 net.cpp:127] Top shape: 3 512 2 7 7 (150528)
I1228 16:09:20.978484 9386 net.cpp:136] Memory required for Data 338989068
I1228 16:09:20.978488 9386 net.cpp:154] relu5b needs backward computation.
I1228 16:09:20.978508 9386 net.cpp:76] Creating Layer pool5
I1228 16:09:20.978514 9386 net.cpp:86] pool5 <- conv5b
I1228 16:09:20.978521 9386 net.cpp:112] pool5 -> pool5
I1228 16:09:20.978533 9386 net.cpp:127] Top shape: 3 512 1 4 4 (24576)
I1228 16:09:20.978538 9386 net.cpp:136] Memory required for Data 339087372
I1228 16:09:20.978543 9386 net.cpp:154] pool5 needs backward computation.
I1228 16:09:20.978551 9386 net.cpp:76] Creating Layer fc6-1
I1228 16:09:20.978556 9386 net.cpp:86] fc6-1 <- pool5
I1228 16:09:20.978562 9386 net.cpp:112] fc6-1 -> fc6-1
I1228 16:09:23.927947 9386 net.cpp:127] Top shape: 3 4096 1 1 1 (12288)
I1228 16:09:23.927994 9386 net.cpp:136] Memory required for Data 339136524
I1228 16:09:23.928002 9386 net.cpp:154] fc6-1 needs backward computation.
I1228 16:09:23.928021 9386 net.cpp:76] Creating Layer relu6
I1228 16:09:23.928033 9386 net.cpp:86] relu6 <- fc6-1
I1228 16:09:23.928046 9386 net.cpp:100] relu6 -> fc6-1 (in-place)
I1228 16:09:23.928053 9386 net.cpp:127] Top shape: 3 4096 1 1 1 (12288)
I1228 16:09:23.928058 9386 net.cpp:136] Memory required for Data 339136524
I1228 16:09:23.928062 9386 net.cpp:154] relu6 needs backward computation.
I1228 16:09:23.928069 9386 net.cpp:76] Creating Layer drop6
I1228 16:09:23.928074 9386 net.cpp:86] drop6 <- fc6-1
I1228 16:09:23.928079 9386 net.cpp:100] drop6 -> fc6-1 (in-place)
I1228 16:09:23.928086 9386 net.cpp:127] Top shape: 3 4096 1 1 1 (12288)
I1228 16:09:23.928091 9386 net.cpp:136] Memory required for Data 339136524
I1228 16:09:23.928094 9386 net.cpp:154] drop6 needs backward computation.
I1228 16:09:23.928102 9386 net.cpp:76] Creating Layer fc7-1
I1228 16:09:23.928107 9386 net.cpp:86] fc7-1 <- fc6-1
I1228 16:09:23.928112 9386 net.cpp:112] fc7-1 -> fc7-1
I1228 16:09:25.364635 9386 net.cpp:127] Top shape: 3 4096 1 1 1 (12288)
I1228 16:09:25.364687 9386 net.cpp:136] Memory required for Data 339185676
I1228 16:09:25.364706 9386 net.cpp:154] fc7-1 needs backward computation.
I1228 16:09:25.364722 9386 net.cpp:76] Creating Layer relu7
I1228 16:09:25.364732 9386 net.cpp:86] relu7 <- fc7-1
I1228 16:09:25.364747 9386 net.cpp:100] relu7 -> fc7-1 (in-place)
I1228 16:09:25.364754 9386 net.cpp:127] Top shape: 3 4096 1 1 1 (12288)
I1228 16:09:25.364760 9386 net.cpp:136] Memory required for Data 339185676
I1228 16:09:25.364764 9386 net.cpp:154] relu7 needs backward computation.
I1228 16:09:25.364771 9386 net.cpp:76] Creating Layer drop7
I1228 16:09:25.364778 9386 net.cpp:86] drop7 <- fc7-1
I1228 16:09:25.364783 9386 net.cpp:100] drop7 -> fc7-1 (in-place)
I1228 16:09:25.364789 9386 net.cpp:127] Top shape: 3 4096 1 1 1 (12288)
I1228 16:09:25.364794 9386 net.cpp:136] Memory required for Data 339185676
I1228 16:09:25.364807 9386 net.cpp:154] drop7 needs backward computation.
I1228 16:09:25.364819 9386 net.cpp:76] Creating Layer fc8
I1228 16:09:25.364823 9386 net.cpp:86] fc8 <- fc7-1
I1228 16:09:25.364830 9386 net.cpp:112] fc8 -> fc8
I1228 16:09:25.365612 9386 net.cpp:127] Top shape: 3 2 1 1 1 (6)
I1228 16:09:25.365620 9386 net.cpp:136] Memory required for Data 339185700
I1228 16:09:25.365624 9386 net.cpp:154] fc8 needs backward computation.
I1228 16:09:25.365633 9386 net.cpp:76] Creating Layer loss
I1228 16:09:25.365638 9386 net.cpp:86] loss <- fc8
I1228 16:09:25.365645 9386 net.cpp:86] loss <- label
I1228 16:09:25.365654 9386 net.cpp:136] Memory required for Data 339185700
I1228 16:09:25.365658 9386 net.cpp:154] loss needs backward computation.
I1228 16:09:25.365689 9386 net.cpp:183] Collecting Learning Rate and Weight Decay.
I1228 16:09:25.365720 9386 net.cpp:176] Network initialization done.
I1228 16:09:25.365725 9386 net.cpp:177] Memory required for Data 339185700
I1228 16:09:25.365856 9386 solver.cpp:44] Creating testing net.
I1228 16:09:25.366689 9386 net.cpp:65] Memory required for Data0
I1228 16:09:25.366737 9386 net.cpp:76] Creating Layer data
I1228 16:09:25.366744 9386 net.cpp:112] data -> data
I1228 16:09:25.366755 9386 net.cpp:112] data -> label
I1228 16:09:25.366763 9386 video_data_layer.cpp:283] Opening file list_train_uniform16_proposal.lst
I1228 16:09:25.366806 9386 video_data_layer.cpp:318] A total of 28 video chunks.
I1228 16:09:25.366811 9386 video_data_layer.cpp:345] read video from dataset/valvideo_validation/diving_01/
I1228 16:09:25.381356 9386 video_data_layer.cpp:366] output data size: 1,3,16,112,112
I1228 16:09:25.381387 9386 video_data_layer.cpp:388] Loading mean file from../../../models/train01_16_128_171_mean.binaryproto
I1228 16:09:25.422559 9386 video_data_layer.cpp:417] Initializing prefetch
I1228 16:09:25.422665 9386 video_data_layer.cpp:419] Prefetch initialized.
I1228 16:09:25.422682 9386 net.cpp:127] Top shape: 1 3 16 112 112 (602112)
I1228 16:09:25.422686 9386 net.cpp:127] Top shape: 1 1 1 1 1 (1)
I1228 16:09:25.422691 9386 net.cpp:136] Memory required for Data 2408452
I1228 16:09:25.422700 9386 net.cpp:159] data does not need backward computation.
I1228 16:09:25.422722 9386 net.cpp:76] Creating Layer conv1a
I1228 16:09:25.422731 9386 net.cpp:86] conv1a <- data
I1228 16:09:25.422744 9386 net.cpp:112] conv1a -> conv1a
I1228 16:09:25.423883 9386 net.cpp:127] Top shape: 1 64 16 112 112 (12845056)
I1228 16:09:25.423893 9386 net.cpp:136] Memory required for Data 53788676
I1228 16:09:25.423898 9386 net.cpp:154] conv1a needs backward computation.
I1228 16:09:25.423907 9386 net.cpp:76] Creating Layer relu1a
I1228 16:09:25.423913 9386 net.cpp:86] relu1a <- conv1a
I1228 16:09:25.423919 9386 net.cpp:100] relu1a -> conv1a (in-place)
I1228 16:09:25.423926 9386 net.cpp:127] Top shape: 1 64 16 112 112 (12845056)
I1228 16:09:25.423931 9386 net.cpp:136] Memory required for Data 53788676
I1228 16:09:25.423935 9386 net.cpp:154] relu1a needs backward computation.
I1228 16:09:25.423943 9386 net.cpp:76] Creating Layer pool1
I1228 16:09:25.423949 9386 net.cpp:86] pool1 <- conv1a
I1228 16:09:25.423954 9386 net.cpp:112] pool1 -> pool1
I1228 16:09:25.423964 9386 net.cpp:127] Top shape: 1 64 16 56 56 (3211264)
I1228 16:09:25.423969 9386 net.cpp:136] Memory required for Data 66633732
I1228 16:09:25.423972 9386 net.cpp:154] pool1 needs backward computation.
I1228 16:09:25.423981 9386 net.cpp:76] Creating Layer conv2a
I1228 16:09:25.423986 9386 net.cpp:86] conv2a <- pool1
I1228 16:09:25.423993 9386 net.cpp:112] conv2a -> conv2a
I1228 16:09:25.446898 9386 net.cpp:127] Top shape: 1 128 16 56 56 (6422528)
I1228 16:09:25.446949 9386 net.cpp:136] Memory required for Data 92323844
I1228 16:09:25.446957 9386 net.cpp:154] conv2a needs backward computation.
I1228 16:09:25.446971 9386 net.cpp:76] Creating Layer relu2a
I1228 16:09:25.446981 9386 net.cpp:86] relu2a <- conv2a
I1228 16:09:25.446995 9386 net.cpp:100] relu2a -> conv2a (in-place)
I1228 16:09:25.447012 9386 net.cpp:127] Top shape: 1 128 16 56 56 (6422528)
I1228 16:09:25.447018 9386 net.cpp:136] Memory required for Data 92323844
I1228 16:09:25.447022 9386 net.cpp:154] relu2a needs backward computation.
I1228 16:09:25.447031 9386 net.cpp:76] Creating Layer pool2
I1228 16:09:25.447036 9386 net.cpp:86] pool2 <- conv2a
I1228 16:09:25.447042 9386 net.cpp:112] pool2 -> pool2
I1228 16:09:25.447052 9386 net.cpp:127] Top shape: 1 128 8 28 28 (802816)
I1228 16:09:25.447057 9386 net.cpp:136] Memory required for Data 95535108
I1228 16:09:25.447062 9386 net.cpp:154] pool2 needs backward computation.
I1228 16:09:25.447072 9386 net.cpp:76] Creating Layer conv3a
I1228 16:09:25.447077 9386 net.cpp:86] conv3a <- pool2
I1228 16:09:25.447083 9386 net.cpp:112] conv3a -> conv3a
I1228 16:09:25.525430 9386 net.cpp:127] Top shape: 1 256 8 28 28 (1605632)
I1228 16:09:25.525447 9386 net.cpp:136] Memory required for Data 101957636
I1228 16:09:25.525465 9386 net.cpp:154] conv3a needs backward computation.
I1228 16:09:25.525487 9386 net.cpp:76] Creating Layer relu3a
I1228 16:09:25.525493 9386 net.cpp:86] relu3a <- conv3a
I1228 16:09:25.525502 9386 net.cpp:100] relu3a -> conv3a (in-place)
I1228 16:09:25.525509 9386 net.cpp:127] Top shape: 1 256 8 28 28 (1605632)
I1228 16:09:25.525514 9386 net.cpp:136] Memory required for Data 101957636
I1228 16:09:25.525519 9386 net.cpp:154] relu3a needs backward computation.
I1228 16:09:25.525527 9386 net.cpp:76] Creating Layer conv3b
I1228 16:09:25.525532 9386 net.cpp:86] conv3b <- conv3a
I1228 16:09:25.525538 9386 net.cpp:112] conv3b -> conv3b
I1228 16:09:25.684691 9386 net.cpp:127] Top shape: 1 256 8 28 28 (1605632)
I1228 16:09:25.684712 9386 net.cpp:136] Memory required for Data 108380164
I1228 16:09:25.684729 9386 net.cpp:154] conv3b needs backward computation.
I1228 16:09:25.684738 9386 net.cpp:76] Creating Layer relu3b
I1228 16:09:25.684746 9386 net.cpp:86] relu3b <- conv3b
I1228 16:09:25.684756 9386 net.cpp:100] relu3b -> conv3b (in-place)
I1228 16:09:25.684762 9386 net.cpp:127] Top shape: 1 256 8 28 28 (1605632)
I1228 16:09:25.684767 9386 net.cpp:136] Memory required for Data 108380164
I1228 16:09:25.684772 9386 net.cpp:154] relu3b needs backward computation.
I1228 16:09:25.684779 9386 net.cpp:76] Creating Layer pool3
I1228 16:09:25.684783 9386 net.cpp:86] pool3 <- conv3b
I1228 16:09:25.684789 9386 net.cpp:112] pool3 -> pool3
I1228 16:09:25.684801 9386 net.cpp:127] Top shape: 1 256 4 14 14 (200704)
I1228 16:09:25.684806 9386 net.cpp:136] Memory required for Data 109182980
I1228 16:09:25.684810 9386 net.cpp:154] pool3 needs backward computation.
I1228 16:09:25.684818 9386 net.cpp:76] Creating Layer conv4a
I1228 16:09:25.684823 9386 net.cpp:86] conv4a <- pool3
I1228 16:09:25.684829 9386 net.cpp:112] conv4a -> conv4a
I1228 16:09:26.007707 9386 net.cpp:127] Top shape: 1 512 4 14 14 (401408)
I1228 16:09:26.007747 9386 net.cpp:136] Memory required for Data 110788612
I1228 16:09:26.007755 9386 net.cpp:154] conv4a needs backward computation.
I1228 16:09:26.007768 9386 net.cpp:76] Creating Layer relu4a
I1228 16:09:26.007782 9386 net.cpp:86] relu4a <- conv4a
I1228 16:09:26.007797 9386 net.cpp:100] relu4a -> conv4a (in-place)
I1228 16:09:26.007804 9386 net.cpp:127] Top shape: 1 512 4 14 14 (401408)
I1228 16:09:26.007808 9386 net.cpp:136] Memory required for Data 110788612
I1228 16:09:26.007813 9386 net.cpp:154] relu4a needs backward computation.
I1228 16:09:26.007823 9386 net.cpp:76] Creating Layer conv4b
I1228 16:09:26.007828 9386 net.cpp:86] conv4b <- conv4a
I1228 16:09:26.007835 9386 net.cpp:112] conv4b -> conv4b
I1228 16:09:26.642273 9386 net.cpp:127] Top shape: 1 512 4 14 14 (401408)
I1228 16:09:26.642313 9386 net.cpp:136] Memory required for Data 112394244
I1228 16:09:26.642319 9386 net.cpp:154] conv4b needs backward computation.
I1228 16:09:26.642345 9386 net.cpp:76] Creating Layer relu4b
I1228 16:09:26.642354 9386 net.cpp:86] relu4b <- conv4b
I1228 16:09:26.642369 9386 net.cpp:100] relu4b -> conv4b (in-place)
I1228 16:09:26.642385 9386 net.cpp:127] Top shape: 1 512 4 14 14 (401408)
I1228 16:09:26.642391 9386 net.cpp:136] Memory required for Data 112394244
I1228 16:09:26.642395 9386 net.cpp:154] relu4b needs backward computation.
I1228 16:09:26.642410 9386 net.cpp:76] Creating Layer pool4
I1228 16:09:26.642416 9386 net.cpp:86] pool4 <- conv4b
I1228 16:09:26.642421 9386 net.cpp:112] pool4 -> pool4
I1228 16:09:26.642431 9386 net.cpp:127] Top shape: 1 512 2 7 7 (50176)
I1228 16:09:26.642436 9386 net.cpp:136] Memory required for Data 112594948
I1228 16:09:26.642441 9386 net.cpp:154] pool4 needs backward computation.
I1228 16:09:26.642449 9386 net.cpp:76] Creating Layer conv5a
I1228 16:09:26.642455 9386 net.cpp:86] conv5a <- pool4
I1228 16:09:26.642462 9386 net.cpp:112] conv5a -> conv5a
I1228 16:09:27.264855 9386 net.cpp:127] Top shape: 1 512 2 7 7 (50176)
I1228 16:09:27.264904 9386 net.cpp:136] Memory required for Data 112795652
I1228 16:09:27.264912 9386 net.cpp:154] conv5a needs backward computation.
I1228 16:09:27.264926 9386 net.cpp:76] Creating Layer relu5a
I1228 16:09:27.264935 9386 net.cpp:86] relu5a <- conv5a
I1228 16:09:27.264950 9386 net.cpp:100] relu5a -> conv5a (in-place)
I1228 16:09:27.264958 9386 net.cpp:127] Top shape: 1 512 2 7 7 (50176)
I1228 16:09:27.264963 9386 net.cpp:136] Memory required for Data 112795652
I1228 16:09:27.264967 9386 net.cpp:154] relu5a needs backward computation.
I1228 16:09:27.264978 9386 net.cpp:76] Creating Layer conv5b
I1228 16:09:27.264983 9386 net.cpp:86] conv5b <- conv5a
I1228 16:09:27.264989 9386 net.cpp:112] conv5b -> conv5b
I1228 16:09:27.869312 9386 net.cpp:127] Top shape: 1 512 2 7 7 (50176)
I1228 16:09:27.869352 9386 net.cpp:136] Memory required for Data 112996356
I1228 16:09:27.869371 9386 net.cpp:154] conv5b needs backward computation.
I1228 16:09:27.869390 9386 net.cpp:76] Creating Layer relu5b
I1228 16:09:27.869400 9386 net.cpp:86] relu5b <- conv5b
I1228 16:09:27.869413 9386 net.cpp:100] relu5b -> conv5b (in-place)
I1228 16:09:27.869421 9386 net.cpp:127] Top shape: 1 512 2 7 7 (50176)
I1228 16:09:27.869426 9386 net.cpp:136] Memory required for Data 112996356
I1228 16:09:27.869431 9386 net.cpp:154] relu5b needs backward computation.
I1228 16:09:27.869439 9386 net.cpp:76] Creating Layer pool5
I1228 16:09:27.869444 9386 net.cpp:86] pool5 <- conv5b
I1228 16:09:27.869452 9386 net.cpp:112] pool5 -> pool5
I1228 16:09:27.869462 9386 net.cpp:127] Top shape: 1 512 1 4 4 (8192)
I1228 16:09:27.869467 9386 net.cpp:136] Memory required for Data 113029124
I1228 16:09:27.869472 9386 net.cpp:154] pool5 needs backward computation.
I1228 16:09:27.869482 9386 net.cpp:76] Creating Layer fc6-1
I1228 16:09:27.869488 9386 net.cpp:86] fc6-1 <- pool5
I1228 16:09:27.869493 9386 net.cpp:112] fc6-1 -> fc6-1
I1228 16:09:30.794142 9386 net.cpp:127] Top shape: 1 4096 1 1 1 (4096)
I1228 16:09:30.794188 9386 net.cpp:136] Memory required for Data 113045508
I1228 16:09:30.794205 9386 net.cpp:154] fc6-1 needs backward computation.
I1228 16:09:30.794224 9386 net.cpp:76] Creating Layer relu6
I1228 16:09:30.794232 9386 net.cpp:86] relu6 <- fc6-1
I1228 16:09:30.794247 9386 net.cpp:100] relu6 -> fc6-1 (in-place)
I1228 16:09:30.794255 9386 net.cpp:127] Top shape: 1 4096 1 1 1 (4096)
I1228 16:09:30.794260 9386 net.cpp:136] Memory required for Data 113045508
I1228 16:09:30.794265 9386 net.cpp:154] relu6 needs backward computation.
I1228 16:09:30.794272 9386 net.cpp:76] Creating Layer drop6
I1228 16:09:30.794277 9386 net.cpp:86] drop6 <- fc6-1
I1228 16:09:30.794283 9386 net.cpp:100] drop6 -> fc6-1 (in-place)
I1228 16:09:30.794291 9386 net.cpp:127] Top shape: 1 4096 1 1 1 (4096)
I1228 16:09:30.794296 9386 net.cpp:136] Memory required for Data 113045508
I1228 16:09:30.794312 9386 net.cpp:154] drop6 needs backward computation.
I1228 16:09:30.794322 9386 net.cpp:76] Creating Layer fc7-1
I1228 16:09:30.794327 9386 net.cpp:86] fc7-1 <- fc6-1
I1228 16:09:30.794332 9386 net.cpp:112] fc7-1 -> fc7-1
I1228 16:09:32.252893 9386 net.cpp:127] Top shape: 1 4096 1 1 1 (4096)
I1228 16:09:32.252948 9386 net.cpp:136] Memory required for Data 113061892
I1228 16:09:32.252957 9386 net.cpp:154] fc7-1 needs backward computation.
I1228 16:09:32.252974 9386 net.cpp:76] Creating Layer relu7
I1228 16:09:32.252982 9386 net.cpp:86] relu7 <- fc7-1
I1228 16:09:32.252995 9386 net.cpp:100] relu7 -> fc7-1 (in-place)
I1228 16:09:32.253003 9386 net.cpp:127] Top shape: 1 4096 1 1 1 (4096)
I1228 16:09:32.253008 9386 net.cpp:136] Memory required for Data 113061892
I1228 16:09:32.253012 9386 net.cpp:154] relu7 needs backward computation.
I1228 16:09:32.253021 9386 net.cpp:76] Creating Layer drop7
I1228 16:09:32.253026 9386 net.cpp:86] drop7 <- fc7-1
I1228 16:09:32.253031 9386 net.cpp:100] drop7 -> fc7-1 (in-place)
I1228 16:09:32.253038 9386 net.cpp:127] Top shape: 1 4096 1 1 1 (4096)
I1228 16:09:32.253043 9386 net.cpp:136] Memory required for Data 113061892
I1228 16:09:32.253047 9386 net.cpp:154] drop7 needs backward computation.
I1228 16:09:32.253056 9386 net.cpp:76] Creating Layer fc8
I1228 16:09:32.253060 9386 net.cpp:86] fc8 <- fc7-1
I1228 16:09:32.253067 9386 net.cpp:112] fc8 -> fc8
I1228 16:09:32.253885 9386 net.cpp:127] Top shape: 1 2 1 1 1 (2)
I1228 16:09:32.253895 9386 net.cpp:136] Memory required for Data 113061900
I1228 16:09:32.253898 9386 net.cpp:154] fc8 needs backward computation.
I1228 16:09:32.253906 9386 net.cpp:76] Creating Layer prob
I1228 16:09:32.253911 9386 net.cpp:86] prob <- fc8
I1228 16:09:32.253918 9386 net.cpp:112] prob -> prob
I1228 16:09:32.253928 9386 net.cpp:127] Top shape: 1 2 1 1 1 (2)
I1228 16:09:32.253934 9386 net.cpp:136] Memory required for Data 113061908
I1228 16:09:32.253938 9386 net.cpp:154] prob needs backward computation.
I1228 16:09:32.253944 9386 net.cpp:76] Creating Layer accuracy
I1228 16:09:32.253949 9386 net.cpp:86] accuracy <- prob
I1228 16:09:32.253957 9386 net.cpp:86] accuracy <- label
I1228 16:09:32.253963 9386 net.cpp:112] accuracy -> accuracy
I1228 16:09:32.253971 9386 net.cpp:127] Top shape: 1 2 1 1 1 (2)
I1228 16:09:32.253976 9386 net.cpp:136] Memory required for Data 113061916
I1228 16:09:32.253980 9386 net.cpp:154] accuracy needs backward computation.
I1228 16:09:32.253984 9386 net.cpp:165] This network produces output accuracy
I1228 16:09:32.254019 9386 net.cpp:183] Collecting Learning Rate and Weight Decay.
I1228 16:09:32.254050 9386 net.cpp:176] Network initialization done.
I1228 16:09:32.254055 9386 net.cpp:177] Memory required for Data 113061916
I1228 16:09:32.254132 9386 solver.cpp:49] Solver scaffolding done.
I1228 16:09:32.254138 9386 finetune_net.cpp:27] Loading from ../../../models/conv3d_deepnetA_sport1m_iter_1900000
I1228 16:09:34.110801 9386 net.cpp:323] Copying source layer data
I1228 16:09:34.110848 9386 net.cpp:323] Copying source layer conv1a
I1228 16:09:34.110944 9386 net.cpp:323] Copying source layer relu1a
I1228 16:09:34.110951 9386 net.cpp:323] Copying source layer pool1
I1228 16:09:34.110955 9386 net.cpp:323] Copying source layer conv2a
I1228 16:09:34.113214 9386 net.cpp:323] Copying source layer relu2a
I1228 16:09:34.113222 9386 net.cpp:323] Copying source layer pool2
I1228 16:09:34.113226 9386 net.cpp:323] Copying source layer conv3a
I1228 16:09:34.122073 9386 net.cpp:323] Copying source layer relu3a
I1228 16:09:34.122084 9386 net.cpp:323] Copying source layer conv3b
I1228 16:09:34.139777 9386 net.cpp:323] Copying source layer relu3b
I1228 16:09:34.139788 9386 net.cpp:323] Copying source layer pool3
I1228 16:09:34.139806 9386 net.cpp:323] Copying source layer conv4a
I1228 16:09:34.174645 9386 net.cpp:323] Copying source layer relu4a
I1228 16:09:34.174677 9386 net.cpp:323] Copying source layer conv4b
I1228 16:09:34.247299 9386 net.cpp:323] Copying source layer relu4b
I1228 16:09:34.247342 9386 net.cpp:323] Copying source layer pool4
I1228 16:09:34.247359 9386 net.cpp:323] Copying source layer conv5a
I1228 16:09:34.320765 9386 net.cpp:323] Copying source layer relu5a
I1228 16:09:34.320809 9386 net.cpp:323] Copying source layer conv5b
I1228 16:09:34.394266 9386 net.cpp:323] Copying source layer relu5b
I1228 16:09:34.394317 9386 net.cpp:323] Copying source layer pool5
I1228 16:09:34.394335 9386 net.cpp:323] Copying source layer fc6-1
I1228 16:09:34.752619 9386 net.cpp:323] Copying source layer relu6
I1228 16:09:34.752671 9386 net.cpp:323] Copying source layer drop6
I1228 16:09:34.752691 9386 net.cpp:323] Copying source layer fc7-1
I1228 16:09:34.931928 9386 net.cpp:323] Copying source layer relu7
I1228 16:09:34.931980 9386 net.cpp:323] Copying source layer drop7
I1228 16:09:34.931998 9386 net.cpp:320] Ignoring source layer fc8-1
I1228 16:09:34.932003 9386 net.cpp:323] Copying source layer loss
I1228 16:09:34.933172 9386 solver.cpp:61] Solving DeepConv3DNet_ucf101_finetuning
I1228 16:09:34.933200 9386 solver.cpp:106] Iteration 0, Testing net
I1228 16:09:34.933207 9386 net.cpp:291] Copying source layer data
I1228 16:09:34.933212 9386 net.cpp:291] Copying source layer conv1a
I1228 16:09:34.933221 9386 net.cpp:291] Copying source layer relu1a
I1228 16:09:34.933225 9386 net.cpp:291] Copying source layer pool1
I1228 16:09:34.933229 9386 net.cpp:291] Copying source layer conv2a
I1228 16:09:34.933236 9386 net.cpp:291] Copying source layer relu2a
I1228 16:09:34.933240 9386 net.cpp:291] Copying source layer pool2
I1228 16:09:34.933244 9386 net.cpp:291] Copying source layer conv3a
I1228 16:09:34.933253 9386 net.cpp:291] Copying source layer relu3a
I1228 16:09:34.933256 9386 net.cpp:291] Copying source layer conv3b
I1228 16:09:34.933261 9386 net.cpp:291] Copying source layer relu3b
I1228 16:09:34.933265 9386 net.cpp:291] Copying source layer pool3
I1228 16:09:34.933270 9386 net.cpp:291] Copying source layer conv4a
I1228 16:09:34.933387 9386 net.cpp:291] Copying source layer relu4a
I1228 16:09:34.933395 9386 net.cpp:291] Copying source layer conv4b
I1228 16:09:34.933607 9386 net.cpp:291] Copying source layer relu4b
I1228 16:09:34.933615 9386 net.cpp:291] Copying source layer pool4
I1228 16:09:34.933620 9386 net.cpp:291] Copying source layer conv5a
I1228 16:09:34.933737 9386 net.cpp:291] Copying source layer relu5a
I1228 16:09:34.933745 9386 net.cpp:291] Copying source layer conv5b
I1228 16:09:34.933969 9386 net.cpp:291] Copying source layer relu5b
I1228 16:09:34.933979 9386 net.cpp:291] Copying source layer pool5
I1228 16:09:34.933984 9386 net.cpp:291] Copying source layer fc6-1
I1228 16:09:34.934346 9386 net.cpp:291] Copying source layer relu6
I1228 16:09:34.934355 9386 net.cpp:291] Copying source layer drop6
I1228 16:09:34.934360 9386 net.cpp:291] Copying source layer fc7-1
I1228 16:09:34.934592 9386 net.cpp:291] Copying source layer relu7
I1228 16:09:34.934600 9386 net.cpp:291] Copying source layer drop7
I1228 16:09:34.934605 9386 net.cpp:291] Copying source layer fc8
I1228 16:09:34.934612 9386 net.cpp:288] Ignoring source layer loss
F1228 16:09:58.454560 9528 video_data_layer.cpp:123] Check failed: read_status Testing must not miss any example
*** Check failure stack trace: ***
@ 0x7f71fa268daa (unknown)
@ 0x7f71fa268ce4 (unknown)
@ 0x7f71fa2686e6 (unknown)
@ 0x7f71fa26b687 (unknown)
@ 0x50ac8c caffe::VideoDataLayerPrefetch<>()
@ 0x7f71fa498184 start_thread
@ 0x7f71f656937d (unknown)
@ (nil) (unknown)
Aborted (core dumped)

why?

thanks.

from scnn.

Kerry0123 commented on September 3, 2024

Can you help me with my problem? Thanks very much.

sophiazy commented on September 3, 2024

@Kerry0123
Hi,
When training your own model, did you generate the correct "list_test_uniform16_proposal.lst" and "list_train_uniform16_proposal.lst" files? I am confused about how to generate these files. Can you help me with my problem? Thanks very much.

zhengshou commented on September 3, 2024

You can find samples under scnn/experiments/THUMOS14/.

Kerry0123 commented on September 3, 2024

I have a question:
In the sample.list_train_uniform16_proposal file:
/dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 2561 1 8
/dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 2593 1 8
/dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 1985 1 16
/dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 2049 1 16
/dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 4161 1 16
/dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 4225 1 16
What do 2561, 2593, 1985, etc. mean? How should I set them for my own training set?
What do 8 and 16 mean? How should I set them for my own training set?
In the sample.list_test_uniform16_proposal file:
/dataset/THUMOS14/test/test_frm_all/video_test_0000004/ 1 0 1
/dataset/THUMOS14/test/test_frm_all/video_test_0000004/ 5 0 1
/dataset/THUMOS14/test/test_frm_all/video_test_0000004/ 9 0 1
/dataset/THUMOS14/test/test_frm_all/video_test_0000004/ 13 0 1
/dataset/THUMOS14/test/test_frm_all/video_test_0000004/ 17 0 1
/dataset/THUMOS14/test/test_frm_all/video_test_0000004/ 21 0 1
What do 1, 5, 9, 13, etc. mean? How should I set them for my own test set?
Thanks very much.

zhengshou commented on September 3, 2024

This is very similar to C3D. Please get familiar with C3D first.

After that, you can find and understand answers in our readme file:

We also provide sample input data files to illustrate the input data file list format, which is slightly different from C3D:
Still, each row corresponds to one input segment.
C3D_sample_rate (used for the proposal and classification networks):
format: video_frame_directory start_frame_index class_label stepsize
stepsize: used to adjust the window length; it measures the step between two consecutive frames in one segment (the frame index of the current frame + stepsize = the frame index of the subsequent frame). Note that each segment consists of 16 frames in total.
example: /dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 2561 3 8
C3D_overlap_loss (used for the localization network):
format: video_frame_directory start_frame_index class_label stepsize overlap
overlap: the overlap, measured by IoU, between the candidate segment and the corresponding ground truth segment.
example: /dataset/THUMOS14/val/validation_frm_all/video_validation_0000051/ 2561 3 8 0.70701
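To make this format concrete, the multiscale candidate windows and the overlap value can be sketched as below. This is an illustrative snippet, not the authors' actual tooling: the helper names are hypothetical, and the sliding stride between consecutive windows is an assumption.

```python
def candidate_windows(num_frames, step_sizes=(1, 2, 4, 8, 16, 32),
                      seg_len=16, stride=16):
    """Enumerate multiscale sliding windows over a video.

    Each window covers seg_len frames sampled `step` frames apart, so its
    temporal span is seg_len * step frames (16 up to 512 for these steps).
    Yields (start_frame_index, step) pairs, 1-indexed as in the .lst files.
    """
    for step in step_sizes:
        span = seg_len * step
        start = 1
        while start + span - 1 <= num_frames:
            yield start, step
            start += stride  # sliding stride between windows (an assumption)

def temporal_iou(seg, gt):
    """Temporal IoU between two (start, end) frame intervals."""
    inter = max(0, min(seg[1], gt[1]) - max(seg[0], gt[0]))
    union = (seg[1] - seg[0]) + (gt[1] - gt[0]) - inter
    return inter / union if union > 0 else 0.0
```

A proposal-list row could then be emitted as `f"{frame_dir} {start} {label} {step}"`, and a localization-list row would append `temporal_iou(...)` against the matched ground-truth segment as the last column.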

sophiazy commented on September 3, 2024

@zhengshou
As for the trimmed videos in the UCF101 dataset, do they also need to be cut into segments using the sliding windows [16, 32, 64, 128, 256, 512]?

zhengshou commented on September 3, 2024

No. As we mentioned in the paper, we uniformly sample 16 frames out of each trimmed video.
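The uniform sampling mentioned here can be sketched as follows. This is an illustrative snippet rather than the authors' code, and the behavior for clips shorter than 16 frames is an assumption.

```python
def uniform_sample_16(num_frames, seg_len=16):
    """Uniformly sample seg_len frame indices (1-indexed) from a trimmed video."""
    if num_frames < seg_len:
        # Pad by repeating the last frame when the clip is too short (an assumption).
        return [min(i, num_frames) for i in range(1, seg_len + 1)]
    step = num_frames / seg_len
    # Evenly spaced indices spanning the whole clip.
    return [int(i * step) + 1 for i in range(seg_len)]
```

For example, a 160-frame trimmed video would yield frames 1, 11, 21, ..., 151, so the single 16-frame segment covers the entire clip.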

from scnn.

sophiazy commented on September 3, 2024

@zhengshou thanks,
In this case, is it correct that one 16-frame segment sampled from a trimmed video is one sample in the training data, rather than one trimmed video being one sample?
