This hands-on exercise uses three datasets:
1. Iris Dataset
2. MNIST
3. MNIST Fashion
Iris Classification
1. Load dataset
from sklearn.datasets import load_iris
iris = load_iris()
X = iris.data
y = iris.target
2. One-hot Encoding
# One hot encoding
from sklearn.preprocessing import OneHotEncoder
enc = OneHotEncoder(sparse=False, handle_unknown='ignore')
enc.fit(y.reshape(len(y), 1))
y_onehot = enc.transform(y.reshape(len(y), 1))
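As an aside (not part of the original notebook), Keras ships its own one-hot utility, which produces the same matrix without fitting an encoder; a minimal sketch assuming the integer labels 0-2 from load_iris:
# Alternative sketch: one-hot encode the integer labels with Keras' utility
from tensorflow.keras.utils import to_categorical
y_onehot_alt = to_categorical(y, num_classes=3)  # shape (150, 3), same values as y_onehot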
3. Train / Test Split
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y_onehot,
                                                    test_size=0.2,
                                                    random_state=13)
4. Model Design
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Dense(32, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(3, activation='softmax')
])
model.summary()
5. Compile & Fit
# Compile - adam, categorical_crossentropy, accuracy
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])
# fit - hist, epochs=100
hist = model.fit(X_train, y_train, epochs=100)
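The Iris set is small, so 100 epochs can overfit; as a hedged variation on the call above (not in the original run), a validation split plus early stopping halts training once the validation loss stops improving:
# Variation sketch: hold out 20% of the training data and stop early
early_stop = tf.keras.callbacks.EarlyStopping(monitor='val_loss', patience=10,
                                              restore_best_weights=True)
hist = model.fit(X_train, y_train, epochs=100,
                 validation_split=0.2, callbacks=[early_stop])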
6. Evaluate and Visualize
# Evaluate
model.evaluate(X_test, y_test, verbose=2)
# Vis
import matplotlib.pyplot as plt
%matplotlib inline
plt.plot(hist.history['loss'], label='loss')
plt.plot(hist.history['accuracy'], label='accuracy')
plt.title('Training loss & accuracy')
plt.xlabel('epochs')
plt.legend()
plt.show()
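Since the labels are one-hot encoded, the predicted class is the argmax over the three output probabilities; a quick sanity check of the test predictions (a sketch, not in the original post):
import numpy as np
# Predicted class index vs. true class index (argmax over the one-hot rows)
pred_classes = np.argmax(model.predict(X_test), axis=1)
true_classes = np.argmax(y_test, axis=1)
print('Manual test accuracy: ', np.mean(pred_classes == true_classes))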
MNIST Classification
1. Load data and split
# Load
import tensorflow as tf
mnist = tf.keras.datasets.mnist
# Split and scaling
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
2. Modeling and compile
# Design - 3 layers
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(1000, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
# Compile - adam, sparse_categorical_crossentropy, accuracy
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
3. Model Fitting
import time
start_time = time.time()
hist = model.fit(x_train, y_train, validation_data=(x_test, y_test),
                 epochs=10, batch_size=100, verbose=1)
print('Fit time: ', time.time() - start_time)
4. Evaluate and Visualize
# Evaluate
score = model.evaluate(x_test, y_test)
print('Test Loss: ', score[0])
print('Test Accuracy: ', score[1])
# Vis
import matplotlib.pyplot as plt
%matplotlib inline
plot_target = ['loss', 'val_loss', 'accuracy', 'val_accuracy']
plt.figure(figsize=(12, 8))
for each in plot_target:
    plt.plot(hist.history[each], label=each)
plt.legend()
plt.grid()
plt.show()
5. Visualize Wrong Predictions
import numpy as np
predicted_result = model.predict(x_test)
predicted_labels = np.argmax(predicted_result, axis=1)
wrong_result = []
for n in range(0, len(y_test)):
    if predicted_labels[n] != y_test[n]:
        wrong_result.append(n)
import random
samples = random.choices(population=wrong_result, k=16)
plt.figure(figsize=(14, 12))
for idx, n in enumerate(samples):
    plt.subplot(4, 4, idx+1)
    plt.imshow(x_test[n].reshape(28, 28), cmap='Greys', interpolation='nearest')
    plt.title('Label: ' + str(y_test[n]) + ' / Predict: ' + str(predicted_labels[n]))
    plt.axis('off')
plt.show()
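To see which digits get confused with which, a confusion matrix is a natural follow-up; a minimal sketch using scikit-learn (already used in the Iris example):
from sklearn.metrics import confusion_matrix
# Rows are the true digits, columns the predicted digits
cm = confusion_matrix(y_test, predicted_labels)
print(cm)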
MNIST Fashion Classification
1. Load data and split
# Load
import tensorflow as tf
fashion_mnist = tf.keras.datasets.fashion_mnist
# Split and scaling
(X_train, y_train), (X_test, y_test) = fashion_mnist.load_data()
X_train, X_test = X_train / 255.0, X_test / 255.0
2. Check data
import random
import matplotlib.pyplot as plt
%matplotlib inline
samples = random.choices(population=range(0, len(y_train)), k=16)
plt.figure(figsize=(14,12))
for idx, n in enumerate(samples):
    plt.subplot(4, 4, idx+1)
    plt.imshow(X_train[n].reshape(28, 28), cmap='Greys', interpolation='nearest')
    plt.title('Label : ' + str(y_train[n]))
    plt.axis('off')
plt.show()
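The Fashion-MNIST labels are just the integers 0-9; for readability, the titles can be mapped to the standard class names (the list below follows the official label order):
# Standard Fashion-MNIST class names, index-aligned with the integer labels
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']
print(class_names[y_train[samples[0]]])  # e.g. the name of the first sampled image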
3. Modeling and compile
model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(1000, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax')
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
4. Model Fitting
import time
start_time = time.time()
hist = model.fit(X_train, y_train, validation_data=(X_test, y_test),
                 epochs=10, batch_size=100, verbose=1)
print('Fit time : ', time.time() - start_time)
5. Evaluate and Visualize
# Evaluate
score = model.evaluate(X_test, y_test)
print('Test Loss : ', score[0])
print('Test Accuracy : ', score[1])
# Vis
import matplotlib.pyplot as plt
%matplotlib inline
plot_target = ['loss', 'val_loss', 'accuracy', 'val_accuracy']
plt.figure(figsize=(12, 8))
for each in plot_target:
    plt.plot(hist.history[each], label=each)
plt.legend()
plt.grid()
plt.show()
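Beyond the overall accuracy, a per-class report shows which garment classes the model struggles with; a hedged sketch using scikit-learn and the class_names list defined above:
import numpy as np
from sklearn.metrics import classification_report
# Per-class precision / recall / F1 on the test set
y_pred = np.argmax(model.predict(X_test), axis=1)
print(classification_report(y_test, y_pred, target_names=class_names))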