All convolutions inside a dense block are ReLU-activated and use batch normalization. Channel-wise concatenation is only possible if the height and width dimensions of the feature maps stay unchanged, so the convolutions within a dense block all use a stride of one. Pooling layers are inserted between dense blocks to downsample the feature maps.
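As a minimal sketch of this idea (assuming a PyTorch implementation with the common BN-ReLU-Conv ordering; the growth rate, layer count, and channel sizes below are illustrative, not taken from the text):

```python
import torch
import torch.nn as nn


class DenseBlock(nn.Module):
    """Dense block sketch: each layer is BN -> ReLU -> 3x3 conv (stride 1),
    and its output is concatenated channel-wise with all earlier feature maps."""

    def __init__(self, in_channels: int, growth_rate: int = 12, num_layers: int = 4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                # stride 1 with padding 1 keeps height/width fixed,
                # which is what makes channel-wise concatenation possible
                nn.Conv2d(channels, growth_rate, kernel_size=3,
                          stride=1, padding=1, bias=False),
            ))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for layer in self.layers:
            x = torch.cat([x, layer(x)], dim=1)  # channel-wise concatenation
        return x


class Transition(nn.Module):
    """Pooling between dense blocks: halves the spatial resolution."""

    def __init__(self):
        super().__init__()
        self.pool = nn.AvgPool2d(kernel_size=2, stride=2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pool(x)


if __name__ == "__main__":
    block = DenseBlock(in_channels=16)
    x = torch.randn(1, 16, 32, 32)
    y = block(x)          # spatial size unchanged: (1, 64, 32, 32)
    z = Transition()(y)   # downsampled between blocks: (1, 64, 16, 16)
    print(y.shape, z.shape)
```

Note how the spatial dimensions stay constant inside the block while the channel count grows with each concatenation, and only the transition's pooling layer changes the resolution.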