All convolutions in a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width of the feature maps remain unchanged, so all convolutions in a dense block have stride one. Pooling layers are inserted between dense blocks to perform the spatial downsampling, as sketched below.
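The following is a minimal PyTorch sketch of these ideas, not a faithful DenseNet implementation: each layer applies batch norm, ReLU, and a stride-one 3×3 convolution, concatenates its output onto its input along the channel axis, and an average-pooling layer between blocks handles the downsampling. The layer names, the growth rate of 32, and the plain pooling step (rather than a full transition layer) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """BN -> ReLU -> 3x3 conv (stride 1); output is concatenated onto the input."""
    def __init__(self, in_channels, growth_rate):
        super().__init__()
        self.bn = nn.BatchNorm2d(in_channels)
        self.relu = nn.ReLU(inplace=True)
        # Stride 1 and padding 1 keep height/width unchanged,
        # which is what makes the concatenation below valid.
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, stride=1, padding=1, bias=False)

    def forward(self, x):
        out = self.conv(self.relu(self.bn(x)))
        # Channel-wise concatenation: spatial dims of x and out must match.
        return torch.cat([x, out], dim=1)

class DenseBlock(nn.Module):
    """A stack of dense layers; channel count grows by growth_rate per layer."""
    def __init__(self, in_channels, growth_rate, num_layers):
        super().__init__()
        self.layers = nn.Sequential(*[
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        ])

    def forward(self, x):
        return self.layers(x)

# Pooling between dense blocks does the spatial downsampling that the
# stride-1 convolutions inside a block never perform.
block = DenseBlock(in_channels=64, growth_rate=32, num_layers=4)
pool = nn.AvgPool2d(kernel_size=2, stride=2)

x = torch.randn(1, 64, 32, 32)
y = pool(block(x))   # channels: 64 + 4*32 = 192; spatial: 32x32 -> 16x16
print(y.shape)       # torch.Size([1, 192, 16, 16])
```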