We propose FedWon, a novel method that eliminates normalization layers in federated learning (FL) and employs scaled weight standardization for multi-domain FL. Experimental results on four datasets and models show that FedWon outperforms FedAvg and the state-of-the-art method FedBN, with improvements exceeding 10% in certain domains. FedWon is versatile for both cross-silo and cross-device FL, and can train even with a batch size of 1, catering to resource-constrained devices. It also effectively addresses the challenge of skewed label distributions.
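As a rough illustration of the building block involved, the following is a minimal NumPy sketch of scaled weight standardization as described in the normalizer-free-network literature: each output unit's weights are shifted to zero mean and rescaled by their standard deviation times the square root of the fan-in, with a gain `gamma`. The function name, the `eps` stabilizer, and the exact tensor layout are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def scaled_weight_standardization(w, gamma=1.0, eps=1e-5):
    """Standardize weights per output unit (a sketch, not FedWon's exact code).

    w     : array of shape (out_units, fan_in)
    gamma : gain factor (often chosen per activation function)
    eps   : small constant assumed here for numerical stability
    """
    fan_in = w.shape[1]
    mean = w.mean(axis=1, keepdims=True)          # per-row mean
    var = w.var(axis=1, keepdims=True)            # per-row variance
    # Resulting rows have zero mean and variance ~ gamma^2 / fan_in.
    return gamma * (w - mean) / np.sqrt(var * fan_in + eps)

rng = np.random.default_rng(0)
w_hat = scaled_weight_standardization(rng.standard_normal((8, 16)), gamma=2.0)
```

Because the transformation acts on weights rather than activations, it does not depend on batch statistics, which is consistent with the abstract's claim that training works even at batch size 1.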