(19) European Patent Office
(11) EP 4 401 048 A8

(12) CORRECTED EUROPEAN PATENT APPLICATION
Note: Bibliography reflects the latest situation

(15) Correction information:
Corrected version no 1 (W1 A1)

(48) Corrigendum issued on:
28.08.2024 Bulletin 2024/35

(43) Date of publication:
17.07.2024 Bulletin 2024/29

(21) Application number: 23218605.6

(22) Date of filing: 20.12.2023
(51) International Patent Classification (IPC): 
G06V 20/00 (2022.01)
G06V 10/774 (2022.01)
G06F 18/2413 (2023.01)
G06V 10/80 (2022.01)
(52) Cooperative Patent Classification (CPC):
G06F 18/24147; G06V 10/7747; G06V 10/811; G06V 20/95; G06V 2201/07
(84) Designated Contracting States:
AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC ME MK MT NL NO PL PT RO RS SE SI SK SM TR
Designated Extension States:
BA
Designated Validation States:
KH MA MD TN

(30) Priority: 11.01.2023 IL 29981723

(71) Applicants:
  • FUJITSU LIMITED
    Kawasaki-shi, Kanagawa 211-8588 (JP)
  • B.G. NEGEV TECHNOLOGIES AND APPLICATIONS LTD., at Ben-Gurion University
    841050 Beer Sheva (IL)

(72) Inventors:
  • Hofman, Omer
    841050 Beer Sheva (IL)
  • Giloni, Amit
    841050 Beer Sheva (IL)
  • Morikawa, Ikuya
    Kawasaki-shi, Kanagawa, 211-8588 (JP)
  • Shimizu, Toshiya
    Kawasaki-shi, Kanagawa, 211-8588 (JP)
  • Elovici, Yuval
    841050 Beer Sheva (IL)
  • Shabtai, Asaf
    841050 Beer Sheva (IL)

(74) Representative: Hoffmann Eitle 
Patent- und Rechtsanwälte PartmbB
Arabellastraße 30
81925 München (DE)

   


(54) DETERMINATION PROGRAM, DETERMINATION METHOD, AND INFORMATION PROCESSING APPARATUS


(57) An information processing apparatus extracts (22) a projection area of an object by inputting an input scene containing a projection of the object to a predetermined projection area extraction model; specifies (23), within the extracted projection area, second areas that are similar to a plurality of first areas each representing a feature of a label in a feature space; outputs (23, 24) a first classification result by performing a first classification process that classifies the object based on the label for which the distribution of combinations of the specified second areas and their associated first areas is closer than a predetermined threshold; and determines (25) whether or not an adversarial patch is included in the object by comparing the first classification result for the input scene with a second classification result obtained by inputting the input scene to a predetermined object detection model.
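The following Python sketch illustrates the flow summarized in the abstract, keyed to the reference numerals (22)-(25). It is not taken from the application itself: the callables extract_projection_area (standing in for the projection area extraction model), embed (the feature-space embedding), detector_predict (the object detection model), the prototypes dictionary of first areas per label, and the value of SIMILARITY_THRESHOLD are all hypothetical placeholders chosen for illustration.

import numpy as np

SIMILARITY_THRESHOLD = 0.5   # assumed value; the application only states "a predetermined threshold"

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def sliding_windows(area, size=32, stride=16):
    """Candidate sub-areas ("second areas") taken from the projection area."""
    h, w = area.shape[:2]
    for y in range(0, max(h - size, 0) + 1, stride):
        for x in range(0, max(w - size, 0) + 1, stride):
            yield area[y:y + size, x:x + size]

def first_classification(scene, extract_projection_area, embed, prototypes):
    """Prototype-based classification corresponding to steps (22)-(24)."""
    # (22) Extract the projection area of the object from the input scene.
    projection_area = extract_projection_area(scene)

    # (23) Match each sub-area to the most similar "first area" (prototype) of every label.
    votes = {label: 0 for label in prototypes}
    for patch in sliding_windows(projection_area):
        feature = embed(patch)
        for label, protos in prototypes.items():
            if max(cosine_similarity(feature, p) for p in protos) > SIMILARITY_THRESHOLD:
                votes[label] += 1

    # (24) The label whose prototypes attract the densest distribution of matches wins.
    return max(votes, key=votes.get)

def contains_adversarial_patch(scene, extract_projection_area, embed, prototypes, detector_predict):
    """Step (25): disagreement between the two classifiers flags a possible adversarial patch."""
    first = first_classification(scene, extract_projection_area, embed, prototypes)
    second = detector_predict(scene)   # classification produced by the object detection model
    return first != second

The comparison in contains_adversarial_patch reflects the determination step of the abstract: when the prototype-based result and the detector's result disagree, the object is flagged as possibly carrying an adversarial patch.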