[OpenR8 solution] Image_PCB_YOLOv3_Keras (Using YOLOv3 algorithm and Keras framework for PCB object detection)
Chapter 1: Image_PCB_YOLOv3_Keras Introduction

 

Image_PCB_YOLOv3_Keras uses the third version of YOLO (YOLOv3), implemented with Keras, to detect objects and determine their categories.

The advantage of the YOLO algorithm is that its detection speed is almost as fast as SSD while its accuracy is also high. This solution uses YOLOv3 to determine the position of capacitors. The main workflow is shown in Fig. 1, and each step is explained in sequence in the sections of this document.

 


Fig. 1. process.

 

※ Since this solution uses Python, please refer to "Tool/python/python Installation.txt" within OpenDBT for Python installation, settings, references, and so on.

 

 

Chapter 2: Image_PCB_YOLOv3_Keras File Introduction

 

Image_PCB_YOLOv3_Keras is located in the solution folder of OpenR8 and contains:

  1. Folders: "data", "src".
  2. Flow files: "1_train_cpu.flow", "1_train_gpu.flow", "2_inference.flow".

※ For first-time users, it is recommended to change only the contents of the files in the data folder. Once you are familiar with them, you can move them to any desired location.

 


Fig. 2. Image_PCB_YOLOv3_Keras location.

 


Fig. 3. Image_PCB_YOLOv3_Keras Folder.

 

Name | Use and function
data | The folder where the sample images, sample labels, and database are placed; its contents are explained in Table 2.
src | The Python files used by 1_train_cpu.flow, 1_train_gpu.flow, and 2_inference.flow.
1_train_cpu.flow, 1_train_gpu.flow | Train the samples in CPU or GPU mode, respectively.
2_inference.flow | Run inference on the samples.

Table 1. Image_PCB_YOLOv3_Keras folder introduction.

 

Fig. 4. Image_PCB_YOLOv3_Keras's data folder.

 

data folder | Use
train_annot_folder | The folder where the training sample labels are placed.
train_image_folder | The folder where the training sample images are placed.
valid_annot_folder | The folder where the test sample labels are placed.
valid_image_folder | The folder where the test sample images are placed.
config.json, config_gpu.json | The parameter settings used by 1_train_cpu.flow or 1_train_gpu.flow to configure training. For a description of the parameters, refer to Table 3.
pcb_train.pkl, pcb_valid.pkl | The label files for training and testing.
backend.h5, pcb.h5 | "backend.h5" is the pretrained model; "pcb.h5" is the model produced after training.

Table 2. Image_PCB_YOLOv3_Keras's data folder introduction.
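Before training, it can help to confirm that the items in Table 2 are in place. Below is a minimal sketch, assuming the current working directory is the Image_PCB_YOLOv3_Keras solution folder.

    # Check that the data folder contains the items listed in Table 2.
    from pathlib import Path

    data = Path("data")
    expected = [
        "train_annot_folder",   # training sample labels (xml)
        "train_image_folder",   # training sample images
        "valid_annot_folder",   # test sample labels (xml)
        "valid_image_folder",   # test sample images
        "config.json",          # CPU training settings
        "config_gpu.json",      # GPU training settings
        "backend.h5",           # pretrained model
    ]
    for name in expected:
        print(name, "found" if (data / name).exists() else "MISSING")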

 

 

 

Chapter 3: Prepare Samples

Before training, we must decide what to detect. Taking this document as an example, we want to detect the location of capacitors, so we have to label the sample images one by one with their category (capacitor) so that they can be included in training.

 

Step 1: Open "labelImg.exe"

Open "labelImg.exe" to label the samples we want to train.

 


Fig. 5. labelImg.exe.

 

Step 2: Select the sample image storage folder

Click "Open Dir" to open the folder where the sample images are placed, as in Fig. 6. Here the images are placed in "data/train_image_folder" (the folder location differs depending on where you put your images; for first-time use it is recommended to place them in "data/train_image_folder", replacing the original images). Then press "Select Folder".

 


Fig. 6. Open the location of the sample image.

 

Step 3: Select the folder for storing the category labels

Click "Save Dir" to select the folder where the category label files will be stored, as in Fig. 7. Here the xml files are placed in "data/train_annot_folder" (the folder location differs depending on where you put your labels; for first-time use it is recommended to use "data/train_annot_folder"). Then press "Select Folder".

 


Fig. 7. Location of folders stored after selecting a sample label category.

 

Step 4: Draw bounding boxes and label categories on the sample images

Press "Create RectBox" to draw a bounding box around a sample, then select its category. One image is not limited to a single category. After the boxes are drawn, press "Save" to save the label file. Press "Next Image" to continue to the next sample image, until all sample images have been labeled with categories.

 


Fig. 8. training image.

 


Fig. 9. click Create RectBox to label the image.

 


Fig. 10. After selecting the sample category press OK.

 


Fig. 11. Box the sample category and press Save.

 


Fig. 12. Press Next Image to continue to the next image.

 

※ For other usage of labelImg.exe, please refer to "[ezai simple AI] labelImg usage method (Windows version)".
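labelImg saves one xml label file per image in the annotation folder. Below is a minimal sketch for checking the saved labels, assuming the xml files were written to data/train_annot_folder as in Step 3 and follow labelImg's standard Pascal VOC layout.

    # List the object categories recorded in each labelImg xml file.
    import xml.etree.ElementTree as ET
    from pathlib import Path

    for xml_file in sorted(Path("data/train_annot_folder").glob("*.xml")):
        root = ET.parse(xml_file).getroot()
        names = [obj.findtext("name") for obj in root.iter("object")]
        print(xml_file.name, names)   # each name should match a label in config.json, e.g. "capacitor"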

 

 

Chapter 4: Run 1_train_cpu.flow or 1_train_gpu.flow

 

There is an "R8.exe" under the OpenR8 folder, as shown in Fig. 13. Double-click it to run R8.exe.

 


Fig. 13. Open R8.exe.

 

Click "File" => "Open" => the solution folder under OpenR8 => select the Image_PCB_YOLOv3_Keras folder => open "1_train_cpu.flow" or "1_train_gpu.flow", as shown in Fig. 15 and Fig. 16.

 

※ You can select a different file according to whether your hardware supports GPU acceleration. If it supports a GPU, choose 1_train_gpu.flow to speed up training; if it does not, choose 1_train_cpu.flow, as shown in Fig. 18.

 

※ Before running, make sure that the pkl files have been deleted, as shown in Fig. 14.

 

※ Before running, if you do not want to use your own previously trained model as the pretrained model, delete all h5 files except backend.h5.
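The two notes above can also be done with a short script. Below is a minimal sketch, assuming the file names from Table 2 (pcb_train.pkl, pcb_valid.pkl, pcb.h5) and keeping backend.h5.

    # Delete the cached pkl label files and any previously trained h5 model,
    # keeping backend.h5 (the pretrained model).
    from pathlib import Path

    data = Path("data")
    for pkl in data.glob("*.pkl"):          # pcb_train.pkl, pcb_valid.pkl
        pkl.unlink()
        print("deleted", pkl.name)
    for h5 in data.glob("*.h5"):
        if h5.name != "backend.h5":         # keep the pretrained model
            h5.unlink()
            print("deleted", h5.name)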

 


Fig. 14. delete pkl files.

 


Fig. 15. Select 1_train_cpu.flow.

 


Fig. 16. Open 1_train_cpu.flow.

 

  1. python:

Execute the train.py file to train the samples.

 

result: This parameter is not filled in by default; if you need to look at the returned content, you can add the parameter yourself.

 

path: The Python file to execute; here "src/train.py" is executed. arg: The parameters to pass to the Python file; here "-c data/config.json" is passed.

 

pythonPath: This parameter is not filled in by default; if "train.py" is not in its original location (the src folder under the Image_PCB_YOLOv3_Keras folder), you need to add the parameter yourself and set the "train.py" path.
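For reference, the function above amounts to running train.py directly with the same argument. Below is a minimal sketch of the equivalent call, assuming Python is on the PATH and the working directory is the solution folder.

    # Equivalent to the 1_train_cpu.flow python function:
    # run src/train.py with the settings in data/config.json.
    import subprocess

    subprocess.run(["python", "src/train.py", "-c", "data/config.json"], check=True)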

 

Fig. 17. 1_train_cpu.flow's python function.

 

Once you have confirmed all the parameters, press Run to start training the samples, until "press any key to continue..." appears.

 

※ Before running, if you want to change parameter settings such as the training model, the number of training epochs, or the classification categories, please see Chapter 6: config.json.

 

If the opened file is "1_train_gpu.flow", the parameters change slightly.

 

Fig. 18. 1_train_gpu.flow's python function.

 

  1. python:

Execute the train.py file to train the samples.

 

result: This parameter is not filled in by default; if you need to look at the returned content, you can add the parameter yourself.

 

path: The Python file to execute; here "src/train.py" is executed. arg: The parameters to pass to the Python file; here "-c data/config_gpu.json" is passed.

 

pythonPath: This parameter is not filled in by default; if "train.py" is not in its original location (the src folder under the Image_PCB_YOLOv3_Keras folder), you need to add the parameter yourself and set the "train.py" path.
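As in the CPU case, below is a minimal sketch of the equivalent call; the only difference is the settings file passed in arg.

    # Equivalent to the 1_train_gpu.flow python function:
    # run src/train.py with the settings in data/config_gpu.json.
    import subprocess

    subprocess.run(["python", "src/train.py", "-c", "data/config_gpu.json"], check=True)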

 

 

Chapter 5: Run 2_inference.flow to Get Inference Results

 

After the 1_train_cpu.flow or 1_train_gpu.flow training has finished, open 2_inference.flow to run inference on images, as shown in Fig. 19 and Fig. 20.

 


Fig. 19. Select 2_inference.flow.

 


Fig. 20. Open 2_inference.flow.

 

  1. python:

Execute the predict.py file to run inference on the samples.

 

result: This parameter is not filled in by default; if you need to look at the returned content, you can add the parameter yourself. path: The Python file to execute; here "src/predict.py" is executed.

 

arg: The parameters to pass to the Python file; here "-c data/config.json -i data/valid_image_folder/13.png" is passed.

pythonPath: This parameter is not filled in by default; if "predict.py" is not in its original location (the src folder under the Image_PCB_YOLOv3_Keras folder), you need to add the parameter yourself and set the "predict.py" path.
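As with training, the function above amounts to running predict.py directly. Below is a minimal sketch of the equivalent call, using the image path from the arg field.

    # Equivalent to the 2_inference.flow python function:
    # run src/predict.py on one test image with the settings in data/config.json.
    import subprocess

    subprocess.run(
        ["python", "src/predict.py",
         "-c", "data/config.json",
         "-i", "data/valid_image_folder/13.png"],
        check=True,
    )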

 

Fig. 21. 2_inference.flow's python function.

 

※ Depending on whether you trained with config.json or config_gpu.json, fill in the corresponding file name in the arg parameter above, for example: "-c data/config_gpu.json -i data/valid_image_folder/13.png".

 

Once you have confirmed the parameters, press Run to see the results.

 

Fig. 22. 2_inference.flow's result.

 

※ Additional instructions:

 

In the function shown in Fig. 21, the parameter field is "-c data/config.json -i data/valid_image_folder/13.png".

If you want to test other images:

 

Please change data/valid_image_folder/13.png to the image path you want to test, for example data/valid_image_folder/11.png.

If you want to test all the images in a folder:

 

Please change data/valid_image_folder/13.png to the path of the test folder, for example data/valid_image_folder.

 

 

Chapter 6: config.json Introduction

 

The difference between config.json and config_gpu.json is the value filled in for "gpus": "1", "0", or "". When "gpus" is "", GPU acceleration is not used.

 

Fig. 23. config_gpu.json's content.

 

Fig. 23 is the content of config.json, and the parameters that may be changed are shown in Table 3.

 

Parameter | Parameter value | Parameter description
labels | ["capacitor"] | The names of the labeled categories. In this document only "capacitor" is labeled; to add more categories, write them as ["capacitor", "new category name", "new category name 2", ...], as shown in Fig. 24.
train_image_folder | "data/train_image_folder/" | The folder location where the training sample images are placed.
train_annot_folder | "data/train_annot_folder/" | The folder location where the training sample labels are placed.
gpus | "" | Set this to run GPU acceleration, depending on which GPU you want to use. If you have two GPUs and want to use the second one, change the value to "1"; if there is only one GPU, the value is "0".
saved_weights_name | "pcb.h5" | The file name of the saved h5 model.
valid_image_folder | "data/valid_image_folder/" | The folder location where the test sample images are placed.
valid_annot_folder | "data/valid_annot_folder/" | The folder location where the test sample labels are placed.

Table 3. config.json's parameter introduction.
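The Table 3 parameters can also be inspected and changed programmatically instead of editing the file by hand. Below is a minimal sketch; it only reads and rewrites data/config.json, and the commented-out edit uses a hypothetical key layout, so follow the nesting actually shown in Fig. 23.

    # Load the training settings, inspect them, optionally edit the keys named
    # in Table 3 (labels, gpus, saved_weights_name, folder paths), and save.
    import json

    with open("data/config.json", encoding="utf-8") as f:
        config = json.load(f)

    print(json.dumps(config, indent=4))   # inspect the current values

    # Example edit (hypothetical key layout; follow the nesting shown in Fig. 23):
    # config["model"]["labels"] = ["capacitor", "new category name"]

    with open("data/config.json", "w", encoding="utf-8") as f:
        json.dump(config, f, indent=4)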

 


Fig. 24. labels: add category.

 

Fig. 25. gpus: set the GPU number.

 

