Perform Dataset Testing in Batches under Operation Mode


Introduction

You can perform batch testing on a dataset in Operation Mode. This feature is available for the Defect Segmentation module and the Classification module; it is unavailable in cascade mode.

After model training, select Tools > Operation Mode in the menu bar to open the Operation Mode window.

Once data is imported, you need to perform a manual check.

After data check, click the Export report button to get a report on accuracy, GPU usage, inference time, etc.

Use the Operation Mode

Steps

  1. Select data source

    For Data Source, you can select either Mech-DLK or Folder.

    Mech-DLK

    Use all images in the current project to demonstrate the performance of the trained model.

    Folder

    Use a large amount of new data for model testing. After you select the image folder path, all images in the folder are imported as a new dataset, which is independent of the original datasets in the project.

    If new images have been added to the current project, they will be imported together with the training set and validation set when you select Mech-DLK as the data source.
  2. Load the model

    Click Load Model. After loading successfully, click Next to enter the inference interface.

  3. Inference

    Click Infer to start inference. Once inference is completed, relevant data such as the inference time and GPU usage will be displayed under Running information.

  4. Perform a manual check

    Perform a manual check on the inference results. Click right when an inference result is correct and wrong when it is incorrect.

  5. Export the report

    After the manual check, you can click Export report to view the inference time, GPU usage, accuracy, etc.
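The accuracy figure in the exported report follows directly from the manual check: it is the fraction of inference results you marked as correct. A minimal sketch of that calculation (the function name and input format here are illustrative assumptions, not part of Mech-DLK):

```python
def compute_accuracy(check_results: list[bool]) -> float:
    """Accuracy = results marked correct / total results checked.

    check_results holds one boolean per image: True if the inference
    result was marked correct during the manual check, False otherwise.
    """
    if not check_results:
        raise ValueError("no results were checked")
    return sum(check_results) / len(check_results)

# Example: 3 of 4 inference results marked correct
print(compute_accuracy([True, True, True, False]))  # 0.75
```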

Running Information

  • Inference time

    Displays the Image inference time and the Average inference time.

  • GPU usage

    Displays the Current GPU usage.
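The Average inference time shown here is the mean of the per-image inference times recorded during the run. A minimal sketch of that relationship (the function name and millisecond unit are illustrative assumptions):

```python
def average_inference_time(per_image_times_ms: list[float]) -> float:
    """Mean per-image inference time, in milliseconds."""
    if not per_image_times_ms:
        raise ValueError("no inference times recorded")
    return sum(per_image_times_ms) / len(per_image_times_ms)

# Example: three images inferred in 12.0, 18.0 and 15.0 ms
print(average_inference_time([12.0, 18.0, 15.0]))  # 15.0
```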
