
MultiScale Feature Fusion CNN model with Keras 

Mhathesh TSR
Subscribe 855
2.7K views

How to implement a #MultiScale #CNN network with #Keras from scratch, and how to create an image data generator to feed images into the CNN models. The video shows how to create two CNN models and combine their outputs for classification, and also how to build, compile, and validate a model with more than two CNNs.
The video explains each line of code and its function in depth, at a beginner-friendly level. A basic understanding of CNN models is needed to code along.
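The fusion idea in the description can be sketched in plain numpy before touching Keras (the `max_pool` helper and toy sizes here are illustrative assumptions, not from the video): features from a full-resolution view and from a pooled, coarser view of the same image are flattened and concatenated into one fused vector, which is what the Concatenate layer later does with the two CNN branches.

```python
import numpy as np

def max_pool(img, k):
    """Non-overlapping k x k max pooling (assumes dims divisible by k)."""
    h, w = img.shape
    return img.reshape(h // k, k, w // k, k).max(axis=(1, 3))

img = np.arange(16.0).reshape(4, 4)   # toy 4x4 "image"
coarse = max_pool(img, 2)             # 2x2 low-resolution view
fused = np.concatenate([img.ravel(), coarse.ravel()])
print(fused.shape)                    # (20,) -> 16 fine + 4 coarse features
```

In the video's model the two "views" come from two CNN branches (one starting with a MaxPool2D), but the multi-scale principle is the same.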
For CNN basics:
• Implementation of Mult...
For the multiscale transfer-learning model:
• Implementation of Mult...
For reinforcement learning with the Stable Baselines framework:
• Playlist

Science

Published: 15 Apr 2020

Comments: 13
@assaddoutoum7169
@assaddoutoum7169 11 months ago
Interesting video and very clear, thanks. I used the code, but the accuracy reached about 92% and then started decreasing. Is this how it is supposed to work?
@bhuvaneshs.k638
@bhuvaneshs.k638 3 years ago
Do you know how to merge two layers of shape (None, n), where n is some number, into a (None, n, n) layer using a Kronecker product in Keras?
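One hedged sketch of what the question above asks for (not from the video): for two batches of vectors of shape (batch, n), the Kronecker product of each vector pair is just their outer product, which gives a (batch, n, n) tensor. numpy's einsum expresses it directly, and the same einsum could presumably be wrapped in a Keras Lambda layer, though that wrapping is an assumption and untested here.

```python
import numpy as np

# two batches of shape (batch, n)
a = np.array([[1.0, 2.0]])
b = np.array([[3.0, 4.0]])

# batched outer product -> shape (batch, n, n)
outer = np.einsum('bi,bj->bij', a, b)
print(outer[0])  # [[3. 4.] [6. 8.]]
```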
@ayushkoirala4345
@ayushkoirala4345 4 years ago
Can you please tell us what kind of dataset you used?
@ajaykrishangairola3269
@ajaykrishangairola3269 1 year ago
@mhathesh, please share the code?
@intensewolf9101
@intensewolf9101 3 years ago
Thank you very much. What is your GitHub? Could you please share this code?
@043_fazlerabbi5
@043_fazlerabbi5 1 year ago
Thanks... @MhatheshTSR
@043_fazlerabbi5
@043_fazlerabbi5 1 year ago
from keras.models import Sequential
from keras.constraints import *
from keras.optimizers import *
from keras.utils import np_utils
from keras import Model
from keras.layers import *
from keras.preprocessing.image import ImageDataGenerator

# original size (696, 520)
f = 256
s = 256

# first model - CNN1
main_model = Sequential()
main_model.add(Conv2D(32, kernel_size=3, input_shape=(f, s, 1), activation='relu'))
main_model.add(BatchNormalization())
main_model.add(MaxPool2D(strides=(5, 5)))
main_model.add(Dropout(0.5))
main_model.add(Conv2D(32, kernel_size=3, activation='relu'))
main_model.add(BatchNormalization())
main_model.add(MaxPool2D(strides=(5, 5)))
main_model.add(Dropout(0.5))
main_model.add(Conv2D(64, kernel_size=3, activation='relu'))
main_model.add(BatchNormalization())
main_model.add(MaxPool2D(strides=(5, 5)))
main_model.add(Dropout(0.5))
# main_model.add(Conv2D(64, kernel_size=3, activation='relu'))
# main_model.add(BatchNormalization())
# main_model.add(MaxPool2D(strides=(5, 5)))
# main_model.add(Dropout(0.5))
main_model.add(Flatten())

# lower features model - CNN2
lower_model1 = Sequential()
lower_model1.add(MaxPool2D(strides=(5, 5), input_shape=(f, s, 1)))
lower_model1.add(Conv2D(32, kernel_size=3, activation='relu'))
lower_model1.add(BatchNormalization())
lower_model1.add(MaxPool2D(strides=(5, 5)))
lower_model1.add(Dropout(0.5))
lower_model1.add(Conv2D(32, kernel_size=3, activation='relu'))
lower_model1.add(BatchNormalization())
lower_model1.add(MaxPool2D(strides=(5, 5)))
lower_model1.add(Dropout(0.5))
# lower_model1.add(Conv2D(64, kernel_size=3, activation='relu'))
# lower_model1.add(BatchNormalization())
# lower_model1.add(MaxPool2D(strides=(5, 5)))
# lower_model1.add(Dropout(0.5))
lower_model1.add(Flatten())

# merged model
merged_model = Concatenate()([main_model.output, lower_model1.output])
x = Dense(128, activation='relu')(merged_model)
x = Dropout(0.25)(x)
x = Dense(64, activation='relu')(x)
x = Dropout(0.25)(x)
x = Dense(32, activation='relu')(x)
# optionally add activity_regularizer=regularizers.l1(0.01) in the dense layer
output = Dense(3, activation='softmax')(x)

final_model = Model(inputs=[main_model.input, lower_model1.input], outputs=[output])
final_model.compile(loss='categorical_crossentropy', optimizer='adam', metrics=['accuracy'])

traindir1 = "/content/drive/My Drive/DL_2_Dataset/bbrod/train"
traindir2 = "/content/drive/My Drive/DL_2_Dataset/bbrod/train"
testdir1 = "/content/drive/My Drive/DL_2_Dataset/bbrod/train"
testdir2 = "/content/drive/My Drive/DL_2_Dataset/bbrod/train"

input_imgen = ImageDataGenerator(rescale=1./255, rotation_range=80,
                                 width_shift_range=0.6, height_shift_range=0.5,
                                 horizontal_flip=True, zoom_range=0.8,
                                 vertical_flip=True, validation_split=0.4)
test_imgen = ImageDataGenerator(rescale=1./255)
batch_size = 16

def generate_generator_multiple(generator, dir1, dir2, batch_size, img_height, img_width, subset):
    genX1 = generator.flow_from_directory(dir1, target_size=(img_height, img_width),
                                          class_mode='categorical', batch_size=batch_size,
                                          shuffle=False, color_mode='grayscale',
                                          seed=7, subset=subset)
    genX2 = generator.flow_from_directory(dir2, target_size=(img_height, img_width),
                                          class_mode='categorical', batch_size=batch_size,
                                          shuffle=False, color_mode='grayscale',
                                          seed=7, subset=subset)
    while True:
        X1i = genX1.next()
        X2i = genX2.next()
        yield [X1i[0], X2i[0]], X2i[1]

inputgenerator = generate_generator_multiple(generator=input_imgen, dir1=traindir1, dir2=traindir2,
                                             batch_size=batch_size, img_height=f,
                                             img_width=s, subset="training")
testgenerator = generate_generator_multiple(input_imgen, dir1=testdir1, dir2=testdir2,
                                            batch_size=batch_size, img_height=f,
                                            img_width=s, subset="validation")

history = final_model.fit_generator(inputgenerator,
                                    # steps_per_epoch=trainsetsize/batch_size,
                                    steps_per_epoch=250,
                                    epochs=100,
                                    validation_data=testgenerator,
                                    validation_steps=100,
                                    shuffle=False)
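A side note on the hardcoded `steps_per_epoch=250` in the code above: the usual convention (hinted at by the commented-out `trainsetsize/batch_size` line) is to derive it from the training-set size, rounding up so every sample is seen once per epoch. The sample count below is an assumption for illustration, not the video's actual dataset size.

```python
import math

def steps_for(num_samples, batch_size):
    """Batches needed to see every sample once per epoch."""
    return math.ceil(num_samples / batch_size)

print(steps_for(4000, 16))  # 250 -> matches the hardcoded value if there are 4000 samples
```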
@engineersmenu
@engineersmenu 3 years ago
Is the code file available?
@meraphone2885
@meraphone2885 3 years ago
Please share your code
@043_fazlerabbi5
@043_fazlerabbi5 1 year ago
Please share the code?
@043_fazlerabbi5
@043_fazlerabbi5 1 year ago
Thanks, Mhathesh TSR