Acoustic radiation force impulse (ARFI)-VTI elastography
Breast cancer is the most common cancer among women in Taiwan, and the number of breast cancer cases reported annually continues to increase. In 2018, breast cancer ranked fourth in terms of mortality. Early-stage (stages 0–2) malignant breast lesions can be diagnosed during regular screening, and early treatment with advanced medical therapies increases survival rates. Ultrasound imaging, including acoustic radiation force impulse (ARFI) imaging, is the first-line examination technique used to locate breast lesion tissue, which can then be quantified by virtual touch tissue imaging (VTI). ARFI-VTI elastography is a breast imaging modality that creates two-dimensional (2D) images to visualize the texture details, elasticity, and morphological features of a region of interest (ROI). A 2D Harris corner convolution is applied during digital image processing to remove speckle noise and enhance the ARFI-VTI images for extraction of lesion tissue in an ROI. The 2D Harris corner convolution, maximum pooling, and random decision forests (RDF) are then integrated into a machine vision classifier to screen subjects for benign or malignant tumors. A total of 320 ARFI-VTI images were collected for the experiments. In the training stage, 122 images were randomly selected to train the RDF-based classifiers, and the remaining images were used for performance evaluation via cross-validation in the recall stage. In 10-fold cross-validation, the proposed classifier achieved a promising mean sensitivity, mean specificity, and mean accuracy of 86.02%, 87.63%, and 86.97%, respectively. Breast tumors visualized on ARFI-VTI images can thus be rapidly screened as malignant or benign using the proposed machine vision classifier.
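As a rough illustration of this pipeline, the sketch below computes a Harris corner response map, applies max pooling, and feeds the pooled features to a random forest evaluated with 10-fold cross-validation. The pooling size, forest settings, and data-loading names (image_paths, labels) are assumptions, not the study's implementation.

```python
# Hypothetical sketch of the described pipeline: 2D Harris corner response,
# max pooling, and a random decision forest (RDF) classifier.
import cv2
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def harris_maxpool_features(image_path, size=128, pool=8):
    """Harris corner response map followed by max pooling -> feature vector."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    gray = cv2.resize(gray, (size, size)).astype(np.float32)
    # Harris corner response emphasizes texture/edge structure in the ROI
    response = cv2.cornerHarris(gray, blockSize=2, ksize=3, k=0.04)
    # Max pooling: keep the strongest response in each pool x pool block
    pooled = response.reshape(size // pool, pool, size // pool, pool).max(axis=(1, 3))
    return pooled.ravel()

# X: one feature vector per ARFI-VTI image; y: 0 = benign, 1 = malignant
# (image_paths and labels are placeholders for the collected dataset)
X = np.array([harris_maxpool_features(p) for p in image_paths])
y = np.array(labels)

rdf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(rdf, X, y, cv=10, scoring="accuracy")
print("10-fold mean accuracy:", scores.mean())
```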
Endoscopic photoacoustic imaging with deep learning
The incidence of colorectal cancer is increasing rapidly every year. At present, traditional medical endoscopic probes are large, slow to image, and of insufficient resolution. Here, we propose a system combining a customized probe-based photoacoustic system with deep learning, improving the photoacoustic hardware to achieve a smaller probe and a higher imaging resolution. In addition, deep learning was embedded into the proposed photoacoustic system to achieve highly accurate classification results, potentially helping physicians identify lesions by providing a second opinion. In this study, a customized probe with a diameter of 9 mm was used to replace the original 1 cm diameter probe used in the hospital. The laser provides 15 W of average power with a pulse energy of approximately 300 μJ. The laser beam is shaped by a focusing lens, and a fiber coupler collimates the light into a parallel beam. As the probe rotates 360°, a slip ring triggers the ultrasound transducer to receive the particle displacements caused by thermal expansion. A GRIN lens with a diameter of 1 mm was used to focus the scattered light. The proposed system achieves a resolution of 800 μm. The collected images are classified by deep learning algorithms, including AlexNet, GoogLeNet, and ResNet, to differentiate polyps from tumors. After comparing these image classification methods, ResNet_18 was selected for image classification, which helps the attending physician reduce fatigue and quickly identify disease.
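For illustration, the following sketch fine-tunes a torchvision ResNet-18 for a two-class polyp-versus-tumor task. The folder layout, transforms, and hyperparameters are assumptions, not the study's actual training configuration.

```python
# Hedged sketch of the classification stage: ResNet-18 fine-tuned to separate
# polyps from tumors. Dataset paths and hyperparameters are illustrative.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import DataLoader

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Expects folders train/polyp and train/tumor (hypothetical layout)
train_set = datasets.ImageFolder("train", transform=transform)
loader = DataLoader(train_set, batch_size=16, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, 2)   # two classes: polyp, tumor

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

model.train()
for epoch in range(10):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```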

Wireless capsule ultrasound endoscopy
We demonstrated the feasibility of the proposed wireless ultrasound capsule endoscopy by imaging a phantom made of 3% gelatin mixed with 0.5% agar. The capsule utilized a series-connected thin-film battery power supply circuit. It was equipped with an electro-acoustic lens to shift the ultrasound beam angle, avoiding the need for complicated rotating structures and extending the power supply time by approximately 3 hours. The proposed capsule ultrasound endoscope is simple and compact, with an M20-threaded outer shell. The capsule successfully imaged the phantom at a depth of 8 cm underwater with a lateral resolution of 138 μm.
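As context for how a received echo line becomes a displayable image line, the sketch below shows a standard envelope-detection and log-compression step; it is a generic illustration of the ultrasound imaging chain, not the capsule's actual signal processing.

```python
# Minimal, generic sketch: RF echo samples -> log-compressed envelope for B-mode display.
import numpy as np
from scipy.signal import hilbert

def a_line_to_image_line(rf_line, dynamic_range_db=60):
    """Convert one received RF A-line into a grayscale image line."""
    envelope = np.abs(hilbert(rf_line))                 # analytic-signal envelope
    envelope /= envelope.max() + 1e-12                  # normalize to [0, 1]
    db = 20 * np.log10(envelope + 1e-12)                # convert to dB
    db = np.clip(db, -dynamic_range_db, 0)              # apply dynamic range
    return (db + dynamic_range_db) / dynamic_range_db   # map to [0, 1] grayscale
```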

Ultrasound device for drug delivery system
Most traditional wireless ultrasound endoscopic capsules (USEC) currently on the market offer only imaging functions. Building on the development of ultrasound (US)-assisted puncture technology, this design aims to inject microcapsules in diabetic patients by pairing a membrane liquid lens (MLL) with a US probe. The injection system achieves intragastric insulin injection in the body: the MLL view is adjusted and the injection position is judged through US imaging, which addresses the risks of perforation and imaging problems. The system not only locates the target point and injects insulin accurately but also avoids the power consumption problem of the motors used in capsules on the market.
Through preclinical animal experiments, this design realizes the combination of in vivo US positioning and a drug injection device. Using the MLL with an ultrasound probe, the system can observe the returned US image and inject drugs. This method replaces motor rotation and extends the operation time. The viewing angle can exceed 45 degrees, which demonstrates the capability of the present system as a safe drug delivery device.
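A minimal sketch of the imaging-guided injection loop described above is given below, assuming a hypothetical controller interface (set_mll_focus, acquire_us_frame, target_in_view, trigger_injection); it only illustrates the workflow, not the device firmware.

```python
# Hypothetical workflow sketch: steer the membrane liquid lens (MLL), acquire an
# ultrasound frame, and trigger injection once the target is confirmed.
# All controller method names are placeholders, not the device's actual API.
def guided_injection(controller, focus_steps, max_attempts=20):
    for attempt in range(max_attempts):
        for focus in focus_steps:                   # sweep the MLL viewing angle
            controller.set_mll_focus(focus)
            frame = controller.acquire_us_frame()   # returned US image
            if controller.target_in_view(frame):    # judge the injection position
                controller.trigger_injection()      # deliver the insulin microcapsule
                return True
    return False                                    # target never confirmed
```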