I have a 3D array with shape (1883, 100, 68), representing (batch, steps, features).
The 68 features are completely different from one another and include things like energy and MFCCs.
I want to normalize each feature independently, according to its own type.
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
# flatten (batch, steps, features) -> (batch, steps * features) before scaling
X_train = scaler.fit_transform(X_train.reshape(X_train.shape[0], -1)).reshape(X_train.shape)
X_test = scaler.transform(X_test.reshape(X_test.shape[0], -1)).reshape(X_test.shape)
print(X_train.shape)
print(max(X_train[0][0]))
print(min(X_train[0][0]))
Clearly, flattening to a 2D array does not work, because each feature then gets normalized against all 6800 columns (100 steps × 68 features). As a result, many of the features become zero across all 100 steps.
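To make the problem concrete, here is a quick sketch (with random data standing in for my array) showing that the scaler ends up fitting one mean per (step, feature) column rather than one per feature:

import numpy as np
from sklearn.preprocessing import StandardScaler

X = np.random.random((1883, 100, 68))   # stand-in for my data
X_flat = X.reshape(X.shape[0], -1)
print(X_flat.shape)                     # (1883, 6800)
scaler = StandardScaler().fit(X_flat)
print(scaler.mean_.shape)               # (6800,): one mean per (step, feature) pair, not per feature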
As an example of what I want: say feature 0 is energy. For one sample in the batch there are 100 steps, and therefore 100 energy values. I want those 100 energy values to be normalized relative to one another.
So the normalization should run over [1,1,0], [1,2,0], [1,3,0] ... [1,100,0], and every other feature should be handled the same way.
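In NumPy terms, I believe the operation described above looks roughly like this sketch (axis 1 is the step axis; the shapes and random data are just placeholders for my arrays):

import numpy as np

X_train = np.random.random((1883, 100, 68))   # placeholder for my data
mean = X_train.mean(axis=1, keepdims=True)    # (1883, 1, 68): one mean per sample and per feature
std = X_train.std(axis=1, keepdims=True)      # (1883, 1, 68)
X_norm = (X_train - mean) / std               # each feature normalized over its own 100 steps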
How should I approach this?
Update:
The following code was written with sai's help.
import numpy as np

def feature_normalization(x):
    # unroll (batch, steps, features) -> (1, batch * steps, features)
    batches_unrolled = np.expand_dims(np.reshape(x, (-1, x.shape[2])), axis=0)
    # per-feature mean/std taken over all samples and steps
    x_normalized = (x - np.mean(batches_unrolled, axis=1, keepdims=True)) / np.std(batches_unrolled, axis=1, keepdims=True)
    np.testing.assert_allclose(x_normalized[0, :, 0], (x[0, :, 0] - np.mean(x[:, :, 0])) / np.std(x[:, :, 0]))
    return x_normalized

def testset_normalization(X_train, X_test):
    # fit the statistics on the training set only, then apply them to the test set
    batches_unrolled = np.expand_dims(np.reshape(X_train, (-1, X_train.shape[2])), axis=0)
    fitted_mean = np.mean(batches_unrolled, axis=1, keepdims=True)
    fitted_std = np.std(batches_unrolled, axis=1, keepdims=True)
    X_test_normalized = (X_test - fitted_mean) / fitted_std
    return X_test_normalized
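A minimal usage sketch, assuming the two functions above and random placeholder data in place of my real arrays (the test-set size is hypothetical):

import numpy as np

X_train = np.random.random((1883, 100, 68))
X_test = np.random.random((400, 100, 68))     # hypothetical test-set size

X_train_norm = feature_normalization(X_train)
X_test_norm = testset_normalization(X_train, X_test)
print(X_train_norm.shape, X_test_norm.shape)  # (1883, 100, 68) (400, 100, 68)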
Answer:
To normalize the features independently across all samples in a batch:
- Unroll the batch samples into a [10 (time steps) * batch size] x [40 features] matrix
- Take the mean and standard deviation of each feature
- Normalize the actual batch samples element-wise
import numpy as np

x = np.random.random((20, 10, 40))
# unroll to (1, batch * steps, features) so axis 1 runs over every sample and step
batches_unrolled = np.expand_dims(np.reshape(x, (-1, 40)), axis=0)
x_normalized = (x - np.mean(batches_unrolled, axis=1, keepdims=True)) / np.std(batches_unrolled, axis=1, keepdims=True)
# sanity check: feature 0 of sample 0 matches normalization by that feature's global mean/std
np.testing.assert_allclose(x_normalized[0, :, 0], (x[0, :, 0] - np.mean(x[:, :, 0])) / np.std(x[:, :, 0]))
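Equivalently (just a sketch, not part of the original answer), the same per-feature statistics can be taken directly on the 3D array by reducing over the sample and step axes together, which avoids the reshape; shown here with the dimensions from the question:

import numpy as np

x = np.random.random((1883, 100, 68))       # (batch, steps, features)
mean = x.mean(axis=(0, 1), keepdims=True)   # (1, 1, 68): one mean per feature
std = x.std(axis=(0, 1), keepdims=True)     # (1, 1, 68)
x_normalized = (x - mean) / std
np.testing.assert_allclose(x_normalized[0, :, 0],
                           (x[0, :, 0] - x[:, :, 0].mean()) / x[:, :, 0].std())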