We develop a machine vision system as a key component of a robot‐assisted packaging system that guides robot arms to pack roast sauries into cans. To generate a gripping strategy, the system must not only detect the roast saury region but also estimate its geometric parameters. In addition, different canning requirements make it necessary to distinguish between types of fish parts. To address these challenges, we propose a novel rule‐based matching method combined with an improved efficient graph‐based image segmentation (EGIS) method for sensing fish parts. Specifically, the matching method applies our originally designed rule‐based similarity within a genetic algorithm framework combined with a deterministic crowding technique, and is used to sense one type of fish part. We further improve EGIS by introducing a shape restriction to handle leftover fish parts. Experiments on two different types of fish parts were conducted in a real factory environment. Our method achieved a mean location accuracy of 93.5% with a practical average processing time of 2.6 s per image. © 2020 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
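To illustrate the deterministic crowding technique mentioned in the abstract, the following is a minimal Python sketch of a genetic algorithm with deterministic crowding replacement, applied to a toy sphere-function objective. All names, parameters, and the objective are hypothetical illustrations; this is not the paper's rule-based similarity measure, which operates on fish-part images.

```python
import random

def deterministic_crowding(fitness, n_dim=2, pop_size=20, gens=50, seed=0):
    """Minimise `fitness` with a GA using deterministic crowding:
    each offspring competes only with its more similar parent and
    replaces it only on improvement, preserving population diversity.
    (Illustrative sketch; all parameters are hypothetical.)"""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(n_dim)] for _ in range(pop_size)]

    def dist(a, b):  # squared Euclidean distance in genotype space
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def mutate(x):  # small Gaussian perturbation of each gene
        return [v + rng.gauss(0, 0.3) for v in x]

    for _ in range(gens):
        rng.shuffle(pop)
        for i in range(0, pop_size - 1, 2):
            p1, p2 = pop[i], pop[i + 1]
            # uniform crossover followed by mutation
            c1 = mutate([a if rng.random() < 0.5 else b for a, b in zip(p1, p2)])
            c2 = mutate([b if rng.random() < 0.5 else a for a, b in zip(p1, p2)])
            # pair each child with the nearer parent (minimal total distance)
            if dist(c1, p1) + dist(c2, p2) <= dist(c1, p2) + dist(c2, p1):
                pairs = [(c1, i), (c2, i + 1)]
            else:
                pairs = [(c1, i + 1), (c2, i)]
            # a child replaces its paired parent only if it is fitter
            for child, j in pairs:
                if fitness(child) < fitness(pop[j]):
                    pop[j] = child
    return min(pop, key=fitness)

# toy objective: sphere function, optimum at the origin
best = deterministic_crowding(lambda x: sum(v * v for v in x))
```

The similarity-based pairing step is what distinguishes deterministic crowding from plain generational replacement: offspring displace only genetically similar parents, so multiple niches (here, multiple candidate fish-part matches) can survive in one population.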