[0001] The present invention relates to a vehicle, an apparatus, a method, and a computer
program for monitoring a vehicle, a mobile transceiver, an apparatus, a method, and
a computer program for a mobile transceiver, an application server, an apparatus,
a method, and a computer program for an application server, more particularly, but
not exclusively, to a concept for obtaining image data of the outside of a vehicle
when a trigger event occurs.
[0002] Vehicles may be equipped with a variety of sensors. Different sensor data of a vehicle,
its components, a driving situation and other road users may be available. In some
situations, information on a status of a parked vehicle may be desirable. For example,
applications running on mobile phones may be used to monitor certain conditions of a
parked vehicle. Examples are tire pressure, temperature inside and outside, fuel level,
maintenance interval, etc.
[0003] Some applications allow taking images from around a vehicle. For example, document
EP 1 367 408 A2, discloses a concept for synthesizing an image showing a situation around a car,
which is produced from images taken with cameras capturing the surroundings of the
vehicle. Document
EP 2 905 704 A1 describes a concept for a vehicle, which collects sensor data for provision to an
owner in case an event is triggered. Such event can be triggered if the vehicle is
parked and shock sensors register that the vehicle was hit or indicate a collision.
A user may activate internal cameras of a vehicle from remote (via Internet), e.g.
to check whether a wallet was left inside the car. Document
EP 2 750 116 A1 describes concept for automated charging for parking vehicles. Document
EP 1 464 540 A1 discloses a concept for a movable bumper camera using a reflecting mirror to determine
an image including a break line showing the outermost part of the vehicle. These monitoring
options for a parked vehicle are limited.
[0004] There is a demand for an improved concept for controlling a parked vehicle. The independent
claims provide an improved concept for controlling a parked vehicle.
[0005] Embodiments are based on the finding that in some situations it is desirable to monitor
the status of a parked vehicle also from the outside. A request for the vehicle
status can have several reasons. Examples are curiosity of the owner or damage of
the vehicle. Such a request may be initiated on a regular basis or based on an event.
It is another finding that a wireless communication system offers the possibility
to reach or to communicate with other vehicles remotely. Other vehicles may be able
to read/recognize the status data of the vehicle. Status data in this context may
refer to the appearance of the vehicle or the relation between the (target) vehicle
and its vicinity. For instance, an owner of a vehicle would like to see the vehicle
to check the condition of the vehicle body or the surroundings after an unpleasant
natural event (e.g. a storm or hail). Embodiments are further based on the finding that
this kind of status information may mostly, or in some cases even only, be measured
or seen by external observers. Embodiments may enable a service to get these observations
(photos or videos) of a target vehicle by other vehicles, which are equipped with
cameras and a communication unit. Embodiments are further based on the finding that
communication of information on a trigger event, a request for image data, and image
data itself can be implemented in various communication architectures, e.g. via an
application server, directly between vehicles, or based on communication with an owner
device.
[0006] Embodiments provide an apparatus for monitoring a vehicle. Another embodiment is
a vehicle with an embodiment of the apparatus. The apparatus comprises one or more
interfaces, which are configured to communicate in a mobile communication system.
The apparatus further comprises a control module, which is configured to control the
one or more interfaces. The control module is further configured to determine a parking
situation of the vehicle and to detect a trigger event. The control module is further
configured to transmit a message comprising information on the trigger event to a
predefined device in case the trigger event is detected. Embodiments provide a concept
for monitoring a vehicle by transmitting a corresponding message based on the trigger
event. For example, the predefined device is an application server or user equipment,
e.g. such a message may be transmitted to another vehicle, an application server,
or an owner's device. The event may be triggered frequently or periodically. Based
on the message, embodiments enable communication with other devices, e.g. other vehicles
or network components in a parking situation of the vehicle.
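As a purely illustrative aid, the following Python sketch shows one possible shape of such a control flow; the helper names (is_parked, detect_trigger_event, send_message) and the message format are assumptions made for this example and are not part of the claimed apparatus.

```python
# Illustrative sketch only: a minimal control flow for the monitoring
# apparatus 10, assuming hypothetical helpers for sensor access and for
# transmitting via the one or more interfaces 12.
def is_parked(state):
    # Parking situation: e.g. ignition off and parking brake engaged.
    return state["ignition"] == "off" and state["parking_brake"]

def detect_trigger_event(state):
    # A trigger may stem from sensor data (e.g. a shock), from the owner,
    # from an application server, or from another vehicle.
    if state.get("shock_detected"):
        return {"type": "shock", "severity": state.get("shock_level", 1)}
    if state.get("owner_request"):
        return {"type": "owner_request"}
    return None

def send_message(predefined_device, payload):
    # Stand-in for a transmission over the mobile communication system 400.
    print(f"to {predefined_device}: {payload}")

def monitor(state, predefined_device="application_server_200"):
    if not is_parked(state):
        return
    event = detect_trigger_event(state)
    if event is not None:
        # Message comprising information on the trigger event.
        send_message(predefined_device, {"vehicle_id": "100", "trigger": event})

monitor({"ignition": "off", "parking_brake": True, "shock_detected": True})
```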
[0007] Embodiments also provide an apparatus for an application server, which is configured
to communicate through a mobile communication network. Another embodiment is an application
server comprising an embodiment of the application server apparatus. The apparatus
comprises one or more interfaces, which are configured to communicate in the mobile
communication system. The apparatus further comprises a control module, which is configured
to control the one or more interfaces. The control module is further configured to
obtain a request for obtaining image data of a first vehicle, and to determine a second
vehicle, which is capable of determining such image data. The control module is further
configured to instruct the second vehicle to obtain and provide the image data. Embodiments
may enable provision of image data of the outside of one vehicle by another vehicle.
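Again purely as an illustration, the following Python sketch shows one way the server-side steps (obtaining the request, determining a second vehicle, instructing it) could be organized; the in-memory registry and all field names are assumptions for this example.

```python
# Illustrative sketch only: server-side handling at apparatus 20, assuming a
# hypothetical registry of camera-equipped vehicles and their positions.
from math import hypot

REGISTRY = {
    "102": {"pos": (10.0, 4.0), "camera": True},
    "103": {"pos": (250.0, 90.0), "camera": False},
}

def determine_second_vehicle(first_pos, max_distance=50.0):
    # Pick the closest camera-equipped vehicle within range of vehicle 100.
    candidates = [
        (hypot(v["pos"][0] - first_pos[0], v["pos"][1] - first_pos[1]), vid)
        for vid, v in REGISTRY.items() if v["camera"]
    ]
    in_range = [(d, vid) for d, vid in candidates if d <= max_distance]
    return min(in_range)[1] if in_range else None

def handle_image_request(request):
    second = determine_second_vehicle(request["first_vehicle_pos"])
    if second is None:
        return {"status": "no_vehicle_in_vicinity"}
    # Instruct the second vehicle to obtain and provide the image data.
    return {"status": "instructed", "second_vehicle": second,
            "instruction": {"action": "capture_image",
                            "target": request["first_vehicle"]}}

print(handle_image_request({"first_vehicle": "100",
                            "first_vehicle_pos": (12.0, 5.0)}))
```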
[0008] Embodiments also provide an apparatus for a mobile transceiver of a mobile communication
system. Another embodiment is a mobile transceiver comprising an embodiment of the
apparatus. The apparatus comprises one or more interfaces, which are configured to
communicate in the mobile communication system. The apparatus further comprises a
control module, which is configured to control the one or more interfaces. The control
module is further configured to generate a request for obtaining image data of a first
vehicle and to forward the request to a second vehicle, which is capable of determining
such image data. The control module is further configured to instruct the second vehicle
to obtain and provide the image data. Embodiments may hence enable a mechanism to
obtain image data of a vehicle from other vehicles.
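A minimal, hypothetical sketch of the mobile-transceiver side follows; the request schema and the path labels are assumptions, not part of the claims.

```python
# Illustrative sketch only: request generation at the mobile transceiver
# apparatus 30; the delivery paths and helper names are assumptions.
def generate_request(first_vehicle_id, reason="owner_check"):
    # Request for obtaining image data of the first vehicle.
    return {"type": "image_request", "target": first_vehicle_id, "reason": reason}

def forward_request(request, second_vehicle_id, path="via_application_server"):
    # The request may reach the second vehicle directly, via the application
    # server 200, or via the first vehicle 100.
    envelope = {"path": path, "to": second_vehicle_id, "body": request}
    print(f"forwarding {envelope}")
    return envelope

forward_request(generate_request("100"), "102", path="direct_v2v")
```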
[0009] For example, the control module of the apparatus for monitoring the vehicle may be
further configured to request a vehicle in the vicinity of the vehicle to provide
image data on the vehicle. Embodiments may enable an efficient mechanism for a first
vehicle to find or identify a second vehicle in the vicinity, which is capable of
providing image data of the first vehicle. In further embodiments the control module
of the apparatus for monitoring the vehicle may be further configured to detect the
trigger event based on one or more elements of the group of information received from
a vehicle owner or a vehicle owner's mobile transceiver, information received from
an application server, information received from another vehicle, and sensor data
from the vehicle. Embodiments may hence enable different services, triggered by different
network components. Such a request may be issued upon detection of the trigger event
and/or upon request from another entity. Other entities, e.g. another vehicle, an
application server, or an owner's device, may request to be provided with image data
of the vehicle. Embodiments may enable an efficient image data detection mechanism.
In further embodiments the control module of the apparatus for monitoring the vehicle
may be further configured to determine the vehicle in the vicinity by communicating
with the application server through the mobile communication system. Embodiments may
enable efficient communication to obtain the image data.
[0010] In some embodiments the control module of the apparatus for monitoring the vehicle
is further configured to determine the vehicle in the vicinity by broadcasting a request
using direct communication through the mobile communication system. Embodiments may
enable car-to-car (C2C) or vehicle-to-vehicle (V2V), e.g. 3rd Generation Partnership
Project (3GPP) V2V, communication to determine capable vehicles
in the vicinity. In other embodiments the control module of the apparatus for monitoring
the vehicle may be configured to communicate a request to provide the image data of
the vehicle, to communicate with the application server, and/or to communicate with
an owner of the vehicle or the owner's mobile transceiver, respectively, using the one or more
interfaces. Embodiments may enable different mechanisms to communicate a request for
image data and the image data itself to an owner of the vehicle.
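The following sketch illustrates, under assumed message formats, how a broadcast capability query over a direct link could be used to pick a capable vehicle in the vicinity; the reply handling is a simplification.

```python
# Illustrative sketch only: determining a capable vehicle in the vicinity by
# broadcasting a capability query over a direct (e.g. V2V) link; the reply
# handling and field names are assumptions, not part of the claims.
def broadcast_capability_query(neighbours, needed="camera"):
    # neighbours stands in for whichever vehicles receive the broadcast.
    return [vid for vid, capabilities in neighbours.items() if needed in capabilities]

def pick_second_vehicle(replies):
    # Simple policy: take the first responder; a real system might rank by
    # distance or by viewing angle on the parked vehicle.
    return replies[0] if replies else None

neighbours = {"102": {"camera", "v2v"}, "104": {"v2v"}}
print(pick_second_vehicle(broadcast_capability_query(neighbours)))
```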
[0011] In embodiments, at the application server the control module may be further configured
to receive the image data from the second vehicle and to provide the image data to
an owner of the first vehicle or the owner's mobile transceiver, respectively. At the mobile
transceiver the control module may be configured to receive the image data using the
one or more interfaces. Such image data may be received from the application server,
from the (first) vehicle itself, or from another (second) vehicle in the vicinity
of the (first) vehicle directly. Embodiments may enable different communication paths
to communicate trigger information, request information and/or the image data. Additionally
or alternatively, the control module for the application server may be further configured
to instruct the second vehicle to provide the image data to the owner of the first
vehicle. In some embodiments, the control module for the mobile transceiver may be
configured to instruct the second vehicle (directly, via the application server, or
via the first vehicle) to provide the image data to the owner of the first vehicle.
[0012] In embodiments the application server apparatus may trigger different communication
paths for the image data. For example, in an embodiment the application server apparatus
further comprises a data base, which is configured to store information on one or
more vehicles and their locations. The control module of the application server apparatus
may be further configured to determine the second vehicle in the vicinity of the first
vehicle using the data base. Embodiments may enable a quick determination of a second
vehicle through use of the data base. In another embodiment the control module at
the application server apparatus may be further configured to receive information
on a trigger event from the vehicle. Embodiments may enable triggering by the vehicle
itself to be monitored. The control module may be further configured to automatically
obtain or generate the request upon reception of the information on the trigger event.
In some embodiments information on the trigger event may be forwarded to the mobile
transceiver apparatus of the vehicle's owner (directly or via the application server
apparatus), where the request may be generated.
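As an illustration of the automatic request generation mentioned above, the following sketch assumes a hypothetical message format and request queue.

```python
# Illustrative sketch only: automatic request generation when information on
# a trigger event arrives from the parked vehicle; the queue and field names
# are assumptions made for this example.
from collections import deque

pending_requests = deque()

def on_trigger_event(message):
    # message carries the information on the trigger event from vehicle 100.
    request = {"type": "image_request",
               "target": message["vehicle_id"],
               "cause": message["trigger"]["type"]}
    pending_requests.append(request)
    return request

on_trigger_event({"vehicle_id": "100", "trigger": {"type": "shock"}})
print(pending_requests.popleft())
```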
[0013] A further embodiment is a method for monitoring a vehicle. The method comprises determining
a parking situation of the vehicle and detecting a trigger event. The method further
comprises transmitting a message comprising information on the trigger event to a
predefined device in case the trigger event is detected.
[0014] Another embodiment is a method for an application server, which is configured to
communicate through a mobile communication network. The method comprises obtaining
a request for obtaining image data of a first vehicle, and determining a second vehicle,
which is capable of determining such image data. The method further comprises instructing
the second vehicle to obtain and provide the image data.
[0015] Yet another embodiment is a method for a mobile transceiver of a mobile communication
system. The method comprises generating a request for obtaining image data of a first
vehicle and forwarding the request to a second vehicle, which is capable of determining
such image data. The method further comprises instructing the second vehicle to obtain
and provide the image data.
[0016] Other embodiments are a system comprising the above described apparatuses and a method
for a system comprising the above described methods.
[0017] Embodiments further provide a computer program having a program code for performing
one or more of the above described methods, when the computer program is executed
on a computer, processor, or programmable hardware component. A further embodiment
is a computer readable storage medium storing instructions which, when executed by
a computer, processor, or programmable hardware component, cause the computer to implement
one of the methods described herein.
[0018] Some other features or aspects will be described using the following non-limiting
embodiments of apparatuses or methods or computer programs or computer program products
by way of example only, and with reference to the accompanying figures, in which:
Fig. 1 illustrates an embodiment of an apparatus for monitoring a vehicle, an embodiment
of a vehicle, an embodiment of an apparatus for an application server, an embodiment
of an application server, an embodiment of an apparatus for a mobile transceiver,
an embodiment of a mobile transceiver, and an embodiment of a system;
Fig. 2 shows a communication scenario in an embodiment;
Fig. 3 illustrates a communication sequence in an embodiment;
Fig. 4 shows another communication sequence in an embodiment;
Fig. 5 shows a block diagram of a flow chart of an embodiment of a method for monitoring
a vehicle;
Fig. 6 shows a block diagram of a flow chart of an embodiment of a method for an application
server; and
Fig. 7 shows a block diagram of a flow chart of an embodiment of a method for a mobile
transceiver.
[0019] Various example embodiments will now be described more fully with reference to the
accompanying drawings in which some example embodiments are illustrated. In the figures,
the thicknesses of lines, layers or regions may be exaggerated for clarity. Optional
components may be illustrated using broken, dashed or dotted lines.
[0020] Accordingly, while example embodiments are capable of various modifications and alternative
forms, embodiments thereof are shown by way of example in the figures and will herein
be described in detail. It should be understood, however, that there is no intent
to limit example embodiments to the particular forms disclosed, but on the contrary,
example embodiments are to cover all modifications, equivalents, and alternatives
falling within the scope of the invention. Like numbers refer to like or similar elements
throughout the description of the figures.
[0021] As used herein, the term, "or" refers to a non-exclusive or, unless otherwise indicated
(e.g., "or else" or "or in the alternative"). Furthermore, as used herein, words used
to describe a relationship between elements should be broadly construed to include
a direct relationship or the presence of intervening elements unless otherwise indicated.
For example, when an element is referred to as being "connected" or "coupled" to another
element, the element may be directly connected or coupled to the other element or
intervening elements may be present. In contrast, when an element is referred to as
being "directly connected" or "directly coupled" to another element, there are no
intervening elements present. Similarly, words such as "between", "adjacent", and
the like should be interpreted in a like fashion.
[0022] The terminology used herein is for the purpose of describing particular embodiments
only and is not intended to be limiting of example embodiments. As used herein, the
singular forms "a", "an" and "the" are intended to include the plural forms as well,
unless the context clearly indicates otherwise. It will be further understood that
the terms "comprises", "comprising", "includes" or "including", when used herein,
specify the presence of stated features, integers, steps, operations, elements or
components, but do not preclude the presence or addition of one or more other features,
integers, steps, operations, elements, components or groups thereof.
[0023] Unless otherwise defined, all terms (including technical and scientific terms) used
herein have the same meaning as commonly understood by one of ordinary skill in the
art to which example embodiments belong. It will be further understood that terms,
e.g., those defined in commonly used dictionaries, should be interpreted as having
a meaning that is consistent with their meaning in the context of the relevant art
and will not be interpreted in an idealized or overly formal sense unless expressly
so defined herein.
[0024] Fig. 1 illustrates an embodiment of an apparatus 10 for monitoring a vehicle 100.
The apparatus 10 is configured to, adapted to or suitable to be used in the vehicle
100. The vehicle 100 is shown in Fig. 1 as optional (broken line). However, another
embodiment is a vehicle 100 comprising an embodiment of the apparatus 10. The apparatus
10 comprises one or more interfaces 12, which are configured to communicate in a mobile
communication system 400. The apparatus 10 further comprises a control module 14,
which is coupled to the one or more interfaces 12, and which is configured to control
the one or more interfaces 12. The control module 14 is further configured to determine
a parking situation of the vehicle 100. The control module 14 is further configured
to detect a trigger event, and to transmit a message comprising information on the
trigger event to a predefined device in case the trigger event is detected.
[0025] Fig. 1 further illustrates an embodiment of an apparatus 20 for an application server
200 being configured to communicate through a mobile communication network 400. The
apparatus 20 is configured to, adapted to or suitable to be used in the application
server 200. The application server 200 is shown in Fig. 1 as optional (broken line).
However, another embodiment is an application server 200 comprising an embodiment
of the apparatus 20. The apparatus 20 comprises one or more interfaces 22 configured
to communicate in the mobile communication system 400. The apparatus 20 further comprises
a control module 24, which is coupled to the one or more interfaces 22. The control
module 24 is configured to control the one or more interfaces 22. The control module
24 is further configured to obtain a request for obtaining image data of a first vehicle
100, and to determine a second vehicle 102, which is capable of determining such image
data. The control module 24 is further configured to instruct the second vehicle 102
to obtain and provide the image data.
[0026] Fig. 1 also illustrates an apparatus 30 for a mobile transceiver 300 of a mobile
communication system 400. Another embodiment is a mobile transceiver 300 comprising
an embodiment of the apparatus 30. The apparatus 30 comprises one or more interfaces
32 configured to communicate in the mobile communication system 400. The apparatus
30 further comprises a control module 34, which is coupled to the one or more interfaces
32. The control module 34 is configured to control the one or more interfaces 32.
The control module 34 is further configured to generate a request for obtaining image
data of a first vehicle 100, and to forward the request to a second vehicle 102, which
is capable of determining such image data. The control module 34 is further configured
to instruct the second vehicle 102 to obtain and provide the image data.
[0027] In embodiments different communication sequences are possible. Fig. 1 also illustrates
an embodiment of a system 400 comprising embodiments of the apparatuses 10, 20 and
30. For example, an owner of an embodiment of a mobile transceiver 300 with the apparatus
30 may request to see an outside image of his vehicle 100. The control module 34 of
the apparatus 30 may further run an accordingly adapted application, which lets the
owner input an according trigger for retrieving the outside image of his vehicle 100.
The mobile device or transceiver 300 may now determine and instruct another vehicle
102 to record or acquire such image data to be communicated and displayed at the mobile
transceiver 300. There are different communication sequences and mechanisms conceivable
in embodiments.
[0028] For example, the mobile transceiver 300 may communicate with the first vehicle 100,
which may be capable of identifying the second vehicle 102 itself, e.g. by means of
using local communication, sensor data, etc. The mobile transceiver 300 may determine
the second vehicle 102 without the help of the first vehicle 100 in some embodiments.
For example, the mobile transceiver may start a query based on the location of the first vehicle
100, which can be predetermined, provided by the first vehicle 100, determined by
the mobile communication system 400, etc. Such a query may be run through an application
server 200, which will be detailed subsequently. Accordingly, the image data from
the second vehicle 102 may be communicated via the first vehicle 100, via the application
server 200, via other vehicles or mobile transceivers, to be finally displayed at
the mobile transceiver 300.
[0029] In other embodiments the trigger may be generated by the parked vehicle 100, e.g.
based on a detected shock, sensor data, an observation of another vehicle, etc. Again
the second vehicle 102 can be determined by different mechanisms as laid out above.
In even further embodiments the trigger event may be determined by the application
server 200, e.g. upon request from the owner (mobile transceiver 300 or any other
owner's device), as will be presented in more detail subsequently.
[0030] In embodiments the one or more interfaces 12, 22, 32 may correspond to any means
for obtaining, receiving, transmitting or providing analog or digital signals or information,
e.g. any connector, contact, pin, register, input port, output port, conductor, lane,
etc. which allows providing or obtaining a signal or information. An interface may
be wireless or wireline and it may be configured to communicate, i.e. transmit or
receive signals or information, with further internal or external components. The one
or more interfaces 12, 22, 32 may comprise further components to enable corresponding
communication in the mobile communication system 400; such components may include
transceiver (transmitter and/or receiver) components, such as one or more Low-Noise
Amplifiers (LNAs), one or more Power-Amplifiers (PAs), one or more duplexers, one
or more diplexers, one or more filters or filter circuitry, one or more converters,
one or more mixers, accordingly adapted radio frequency components, etc. The one or
more interfaces 12, 22, 32 may be coupled to one or more antennas, which may correspond
to any transmit and/or receive antennas, such as horn antennas, dipole antennas, patch
antennas, sector antennas etc. The antennas may be arranged in a defined geometrical
setting, such as a uniform array, a linear array, a circular array, a triangular array,
a uniform field antenna, a field array, combinations thereof, etc. In some examples
the one or more interfaces 12, 22, 32 may serve the purpose of transmitting or receiving
or both, transmitting and receiving, information, such as information related to capabilities,
application requirements, trigger indications, requests, message interface configurations,
feedback, information related to control commands etc.
[0031] As shown in Fig. 1 the respective one or more interfaces 12, 22, 32 are coupled to
the respective control modules 14, 24, 34 at the apparatuses 10, 20, 30. In embodiments
the control modules 14, 24, 34 may be implemented using one or more processing units,
one or more processing devices, any means for processing, such as a processor, a computer
or a programmable hardware component being operable with accordingly adapted software.
In other words, the described functions of the control modules 14, 24, 34 may as well
be implemented in software, which is then executed on one or more programmable hardware
components. Such hardware components may comprise a general purpose processor, a Digital
Signal Processor (DSP), a micro-controller, etc.
Fig. 1 also shows an embodiment of a system 400 comprising embodiments of the vehicles
100, 102 (mobile or relay transceivers), a mobile transceiver 300, and the application
server 200, which may correspond to a network controller/server or base station, respectively.
In embodiments, communication, i.e. transmission, reception or both, may take place
among mobile transceivers/vehicles 100, 102 directly and/or between mobile transceivers/vehicles
100, 102 and a network component (infrastructure or mobile transceiver) or application
server 200 (e.g. a base station, a network server, a backend server, etc.). Such communication
may make use of a mobile communication system 400. Such communication may be carried
out directly, e.g. by means of Device-to-Device (D2D) communication, which may also
comprise Vehicle-to-Vehicle (V2V) or car-to-car communication in case of vehicles
100, 102. Such communication may be carried out using the specifications of a mobile
communication system 400.
[0033] The mobile communication system 400 may, for example, correspond to one of the Third
Generation Partnership Project (3GPP)-standardized mobile communication networks,
where the term mobile communication system is used synonymously to mobile communication
network. The mobile or wireless communication system 400 may correspond to a mobile
communication system of the 5th Generation (5G) and may use mm-Wave technology. The
mobile communication system may correspond to or comprise, for example, a Long-Term
Evolution (LTE), an LTE-Advanced (LTE-A), High Speed Packet Access (HSPA), a Universal
Mobile Telecommunication System (UMTS) or a UMTS Terrestrial Radio Access Network
(UTRAN), an evolved-UTRAN (e-UTRAN), a Global System for Mobile communication (GSM)
or Enhanced Data rates for GSM Evolution (EDGE) network, a GSM/EDGE Radio Access Network
(GERAN), or mobile communication networks with different standards, for example, a
Worldwide Inter-operability for Microwave Access (WIMAX) network IEEE 802.16 or Wireless
Local Area Network (WLAN) IEEE 802.11, generally an Orthogonal Frequency Division
Multiple Access (OFDMA) network, a Time Division Multiple Access (TDMA) network, a
Code Division Multiple Access (CDMA) network, a Wideband-CDMA (WCDMA) network, a Frequency
Division Multiple Access (FDMA) network, a Spatial Division Multiple Access (SDMA)
network, etc.
[0034] A base station transceiver can be operable or configured to communicate with one
or more active mobile transceivers/vehicles 100, 102, 300 and a base station transceiver
can be located in or adjacent to a coverage area of another base station transceiver,
e.g. a macro cell base station transceiver or small cell base station transceiver.
Hence, embodiments may provide a mobile communication system 400 comprising two or
more mobile transceivers/vehicles 100, 102, 300 and one or more base station transceivers,
wherein the base station transceivers may establish macro cells or small cells, as
e.g. pico-, metro-, or femto cells. A mobile transceiver may correspond to a smartphone,
a cell phone, user equipment, a laptop, a notebook, a personal computer, a Personal
Digital Assistant (PDA), a Universal Serial Bus (USB) stick, a car, a vehicle, etc.
A mobile transceiver may also be referred to as User Equipment (UE) or mobile in line
with the 3GPP terminology. A vehicle 100, 102 may correspond to any conceivable means
for transportation, e.g. a car, a bike, a motorbike, a van, a truck, a bus, a ship,
a boat, a plane, a train, a tram, etc.
[0035] A base station transceiver can be located in the fixed or stationary part of the
network or system. A base station transceiver may correspond to a remote radio head,
a transmission point, an access point, a macro cell, a small cell, a micro cell, a
femto cell, a metro cell etc. A base station transceiver can be a wireless interface
of a wired network, which enables transmission of radio signals to a UE or mobile
transceiver. Such a radio signal may comply with radio signals as, for example, standardized
by 3GPP or, generally, in line with one or more of the above listed systems. Thus,
a base station transceiver may correspond to a NodeB, an eNodeB, a Base Transceiver
Station (BTS), an access point, a remote radio head, a relay station, a transmission
point etc., which may be further subdivided in a remote unit and a central unit.
[0036] A mobile transceiver 100, 102, 300 can be associated with a base station transceiver
or cell. The term cell refers to a coverage area of radio services provided by a base
station transceiver, e.g. a NodeB (NB), an eNodeB (eNB), a remote radio head, a transmission
point, etc. A base station transceiver may operate one or more cells on one or more
frequency layers, in some embodiments a cell may correspond to a sector. For example,
sectors can be achieved using sector antennas, which provide a characteristic for
covering an angular section around a remote unit or base station transceiver. In some
embodiments, a base station transceiver may, for example, operate three or six cells
covering sectors of 120° (in case of three cells), 60° (in case of six cells) respectively.
A base station transceiver may operate multiple sectorized antennas. In the following
a cell may represent an according base station transceiver generating the cell or,
likewise, a base station transceiver may represent a cell the base station transceiver
generates.
[0037] Mobile transceivers 100, 102, 300 may communicate directly with each other, i.e.
without involving any base station transceiver, which is also referred to as Device-to-Device
(D2D) communication. An example of D2D is direct communication between vehicles, also
referred to as Vehicle-to-Vehicle (V2V) or car-to-car communication, e.g. using IEEE
802.11p. In embodiments the one or more interfaces 12, 22, 32 can be configured
to use this kind of communication. In order to do so radio resources are used, e.g.
frequency, time, code, and/or spatial resources, which may as well be used for wireless
communication with a base station transceiver. The assignment of the radio resources
may be controlled by a base station transceiver, i.e. the determination of which resources
are used for D2D and which are not. Here and in the following, radio resources of the
respective components may correspond to any radio resources conceivable on radio carriers
and they may use the same or different granularities on the respective carriers. The
radio resources may correspond to a Resource Block (RB as in LTE/LTE-A/LTE-unlicensed
(LTE-U)), one or more carriers, sub-carriers, one or more radio frames, radio sub-frames,
radio slots, one or more code sequences potentially with a respective spreading factor,
one or more spatial resources, such as spatial sub-channels, spatial precoding vectors,
any combination thereof, etc.
[0038] For example, in direct Cellular Vehicle-to-Anything (C-V2X), where V2X includes at
least V2V, Vehicle-to-Infrastructure (V2I), etc., transmission according to 3GPP Release
14 onward can be managed by infrastructure (so-called mode 3) or run in a User Equipment
(UE) Autonomous mode (UEA) (so-called mode 4). In embodiments the two or more mobile
transceivers 100, 102, 300 as indicated by Fig. 1 may be registered in the same mobile
communication system 400. In other embodiments one or more of the mobile transceivers
100, 102, 300 may be registered in different mobile communication systems 400. The
different mobile communication systems 400 may use the same access technology but
different operators or they may use different access technologies as outlined above.
[0039] In another embodiment a status of a parked vehicle 100 is monitored. The request
for the vehicle status can have several reasons (curiosity of the owner or damage
of the vehicle, for instance) and triggers, which could be initiated either on a regular
basis or based on an event (storm, hail, accident, etc.). The wireless communication
system 400 offers the possibility to reach the vehicle 100 remotely and read/recognize
the status data of the vehicle 100. Status data may refer to the appearance of the
vehicle 100 or the relation between the (target) vehicle 100 and its vicinity. For
instance, an owner of the vehicle 100 would like to see the vehicle to check the condition
of the vehicle body or the surroundings after an unpleasant natural event (e.g. a storm
or hail). This kind of status information may be measured or seen by external observers
or vehicles 102. Embodiments may enable a service to get these observations (photos
or videos) of a target vehicle 100 by other vehicles 102, which are equipped with
cameras and a communication unit.
[0040] In embodiments a vehicle status may be interesting in many scenarios. In one scenario,
the owner, monitoring or application server 200 initiates a status request (regular
or event dependent) and in another scenario the vehicle 100 itself may initiate the
status monitoring (regular or event dependent). In yet another scenario another vehicle
may generate the event, e.g. a vehicle that just had contact with vehicle 100 or which
witnessed contact between any object and vehicle 100. The status report, or message
comprising information on the trigger event from the vehicle 100, might be simple
information, such as "I am ok", for instance, a photo (of the area around the vehicle
100 or of the vehicle 100 itself) or a detailed report based on predefined parameters.
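The three report shapes mentioned above could, for illustration only, be represented as follows; the schema is an assumption and not prescribed by the embodiments.

```python
# Illustrative sketch only: three possible shapes of the status report or
# trigger message described above; the schema is an assumption.
def simple_report():
    return {"vehicle_id": "100", "status": "I am ok"}

def photo_report(image_bytes):
    return {"vehicle_id": "100", "status": "photo", "image": image_bytes}

def detailed_report(parameters):
    # parameters could hold predefined values such as tire pressure or fuel level.
    return {"vehicle_id": "100", "status": "detailed", "parameters": parameters}

print(simple_report())
print(detailed_report({"tire_pressure_bar": 2.3, "fuel_level_percent": 60}))
```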
[0041] Fig. 2 illustrates a communication scenario in an embodiment. Fig. 2 shows an embodiment
of a vehicle 100 comprising an embodiment of the above apparatus 10. The vehicle 100
may have multiple access possibilities or technologies, e.g. the interface 12 may use a
3GPP RAN or WiFi to access the mobile communication system 400. The mobile communication
system 400 comprises one or more operator core networks, and connects, via Internet
in the embodiment shown in Fig. 2, to an embodiment of an application server 200.
User equipment 300 may be connected to the mobile communication system 400 via another
operator core network and another RAN. For example, the predefined device, to transmit
the message from the vehicle 100 to, may be the application server 200 or the user
equipment 300.
[0042] In Fig. 2, a simple embodiment is illustrated. The target vehicle 100 is damaged
and requests a status report process to inform the owner 300 about this situation.
As the vehicle 100 is connected to the network, via 3GPP RAN or WiFi (e.g. via a non-3GPP
Interworking Network Function) for instance, this request is sent to either the application
server 200 or the vehicle owner 300, which or who initiates a status monitoring process.
For example, the control module 14 at the apparatus 10 may be further configured to
detect the trigger event based on one or more elements of the group of information
received from a vehicle owner, information received from an application server 200,
information received from another vehicle 102, and sensor data from the vehicle 100.
For example, another vehicle 102 may monitor an accident and inform the vehicle 100
thereby triggering the event. As further illustrated by Fig. 2 the vehicle 100 is
reachable via mobile device 300 (or via server 200). The vehicle 100 may detect an
unusual situation and may want to know the status of its vehicle body. For example,
the control module 14 of the monitoring apparatus 10 is further configured to request
a vehicle 102 in the vicinity of the vehicle 100 to provide image data on the vehicle
100. The control module 14 may be further configured to request the image data upon
detection of the trigger event (e.g. an accident or shaking of the vehicle) and/or
upon request (e.g. from an owner of the vehicle 100, another vehicle, or an application
server 200).
[0043] The next steps are performed in some embodiments as described in the following alternative
methods/procedures:
- 1) The application server 200 (owner) may request the network 400
- to localize a vehicle 102 (e.g. the mobile communication system's 400 core network
or radio access network may provide a localization service), which is equipped with
the required measurement devices (in this case a camera), in the proximity of the
target vehicle 100,
- to request this vehicle to perform the corresponding measurement (in this case taking
a photo/video), and
- to send a photo to the owner (the owner's device 300, respectively) or application
server 200.
[0044] For example, the control module 14 is configured to determine the vehicle 102 in
the vicinity by communicating with an application server 200 through the mobile communication
system 400. The application server 200 may keep track of vehicles and may hence have
information on potential vehicles in the vicinity of vehicle 100 available. In some
embodiments the application server 200 may be configured to determine the vehicle
102 in the vicinity of vehicle 100 on demand or upon request. At the mobile transceiver
300 the control module 34 can be configured to trigger the event, e.g. based on weather
conditions, and start a query for the vehicle 102. For example, vehicles 102 or other
mobile transceivers may let the mobile transceiver 300 know whether they are capable
of fulfilling the request and provide the requested data to the mobile transceiver
300, e.g. via the vehicle 100 and/or the application server 200.
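The three steps of procedure 1) above can be illustrated by the following sketch; the localization service and helper names are hypothetical stand-ins.

```python
# Illustrative sketch only: the three steps of procedure 1), orchestrated at
# the application server; the localization service and helper names are
# assumptions made for this example.
class FakeNetwork:
    # Stand-in for a localization service of the mobile communication system 400.
    def find_nearby(self, target_pos, device):
        return "102"

def localize_capable_vehicle(network, target_pos, device="camera"):
    # Step 1: localize a suitably equipped vehicle near the target vehicle 100.
    return network.find_nearby(target_pos, device)

def request_measurement(vehicle_id, target_id):
    # Step 2: request the located vehicle to take a photo/video of the target.
    return {"to": vehicle_id, "action": "capture", "target": target_id}

def deliver(photo, recipient="owner_device_300"):
    # Step 3: send the photo to the owner's device or to the application server.
    print(f"sending {len(photo)} bytes to {recipient}")

vehicle = localize_capable_vehicle(FakeNetwork(), (12.0, 5.0))
print(request_measurement(vehicle, "100"))
deliver(b"example-photo-bytes")
```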
[0045] The control module 14 may be further configured to determine the vehicle 102 in the
vicinity by broadcasting a request using direct communication through the mobile communication
system 400. Hence, in some embodiments the vehicle 100 may broadcast such a request
to vehicles in its vicinity. Generally in embodiments, the control module 14 may be
configured to communicate a request to provide the image data of the vehicle 100,
to communicate with the application server 200, and/or to communicate with an owner
of the vehicle 100 using the one or more interfaces 12. The control module 24 at the
application server apparatus 20 may be configured to receive the image data from the
second vehicle 102 and to provide the image data to an owner (e.g. the mobile transceiver
300 of the owner) of the first vehicle 100. The control module 24 may further be configured
to instruct the second vehicle 102 to provide the image data to the owner 300 of the
first vehicle 100.
[0046] Additionally or alternatively, the control module 34 of the mobile transceiver apparatus
30 may be configured to determine and instruct the second vehicle 102 as explained
above.
[0047] The control module 24, additionally or alternatively the control module 34, may be
further configured to receive information on a trigger event from the vehicle 100.
The control module 24, 34 may be further configured to obtain the request upon reception
of the information on the trigger event.
[0048] The application server 200, which may potentially be operated by a vehicle manufacturer
or a service provider and which may possess a data base of one or more vehicles and
their positions/locations,
- contacts the vehicles in the proximity of the target vehicle 100 via a wireless connection,
- requests the vehicles 102 to perform the corresponding measurements (photo/video)
of the target vehicle 100, and
- requests the vehicles 102 to send this measurement.
[0049] The application server apparatus 20 may further comprise a data base 26, which is
configured to store information on one or more vehicles and their locations. The control
module 24 may be further configured to determine the second vehicle 102 in the vicinity
of the first vehicle 100 using the data base 26.
[0050] Fig. 3 illustrates a communication sequence in an embodiment. Fig. 3 illustrates
vehicle 100, which after being hit by another vehicle requests image data on its body.
Similar to what was already described with respect to Fig. 2, the vehicle 100 is connected
to a mobile communication system 400 via one or more RANs, potentially being interconnected
by the internet. An application server 200 and user equipment 300 may as well be connected
to the network 400. The request for image data may hence be communicated from the
vehicle 100 to the application server 200. The application server 200 may request
to take a photo or the status of vehicle 100 from vehicle 102, which is located in
the vicinity of vehicle 100. Vehicle 102 may then take a photo of vehicle 100 and
send this photo to the requester (application server 200, vehicle 100, or owner UE
300).
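For illustration only, the Fig. 3 sequence could be rendered as the following toy message exchange; the actors and message shapes are assumptions.

```python
# Illustrative sketch only: the Fig. 3 sequence as a toy message exchange;
# the actors and message shapes are assumptions made for illustration.
def fig3_sequence():
    return [
        ("vehicle_100", "application_server_200",
         {"type": "image_request", "cause": "hit_detected"}),
        ("application_server_200", "vehicle_102",
         {"type": "capture_request", "target": "vehicle_100"}),
        ("vehicle_102", "application_server_200",
         {"type": "image_data", "target": "vehicle_100"}),
        ("application_server_200", "owner_ue_300",
         {"type": "image_data", "target": "vehicle_100"}),
    ]

for sender, receiver, msg in fig3_sequence():
    print(f"{sender} -> {receiver}: {msg['type']}")
```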
[0051] The target vehicle 100 may request the application server 200 (or owner 300), which
is operated by a vehicle manufacturer or a service provider and which may possess a
data base of vehicles and their positions/locations, to contact the vehicles in the
proximity of the target vehicle 100 via a wireless connection, and to
- request a vehicle 102 (or several vehicles 102) to perform the corresponding measurements,
and to
- request to send this measurement.
[0052] The target vehicle 100 may request the vehicles in the proximity of the target vehicle
100 via a wireless connection to perform the corresponding measurements and may request
the vehicles to send this measurement (either to the target vehicle 100, to an application
server 200, or to an owner UE 300, for instance).
[0053] Fig. 4 shows another communication sequence in an embodiment. Fig. 4 shows similar
components as already described with the help of Figs. 2 and 3. In this embodiment
vehicle 100 requests vehicle 102, e.g. by means of using a broadcast message, to take
a photo/detect status and provide the image data to the requester.
[0054] Fig. 5 shows a block diagram of a flow chart of an embodiment of a method 40 for
monitoring a vehicle 100. The method 40 for monitoring the vehicle 100 comprises determining
42 a parking situation of the vehicle 100, detecting 44 a trigger event, and transmitting
46 a message comprising information on the trigger event to a predefined device in
case the trigger event is detected.
[0055] Fig. 6 shows a block diagram of a flow chart of an embodiment of a method 50 for
an application server 200. The method 50 for the application server 200 is configured
to communicate through a mobile communication network 400. The method 50 comprises
obtaining 52 a request for obtaining image data of a first vehicle 100, determining
54 a second vehicle 102, which is capable of determining such image data, and instructing
56 the second vehicle 102 to obtain and provide the image data.
[0056] Fig. 7 shows a block diagram of a flow chart of an embodiment of a method 60 for
a mobile transceiver 300 of a mobile communication system 400. The method 60 comprises
generating 62 a request for obtaining image data of a first vehicle 100. The method
60 further comprises forwarding 64 the request to a second vehicle 102, which is capable
of determining such image data, and instructing 66 the second vehicle 102 to obtain
and provide the image data.
[0057] As already mentioned, in embodiments the respective methods may be implemented as
computer programs or codes, which can be executed on a respective hardware. Hence,
another embodiment is a computer program having a program code for performing at least
one of the above methods, when the computer program is executed on a computer, a processor,
or a programmable hardware component. A further embodiment is a (non-transitory) computer
readable storage medium storing instructions which, when executed by a computer, processor,
or programmable hardware component, cause the computer to implement one of the methods
described herein.
[0058] A person of skill in the art would readily recognize that steps of various above-described
methods can be performed by programmed computers, for example, positions of slots
may be determined or calculated. Herein, some embodiments are also intended to cover
program storage devices, e.g., digital data storage media, which are machine or computer
readable and encode machine-executable or computer-executable programs of instructions
where said instructions perform some or all of the steps of methods described herein.
The program storage devices may be, e.g., digital memories, magnetic storage media
such as magnetic disks and magnetic tapes, hard drives, or optically readable digital
data storage media. The embodiments are also intended to cover computers programmed
to perform said steps of methods described herein or (field) programmable logic arrays
((F)PLAs) or (field) programmable gate arrays ((F)PGAs), programmed to perform said
steps of the above-described methods.
[0059] The description and drawings merely illustrate the principles of the invention. It
will thus be appreciated that those skilled in the art will be able to devise various
arrangements that, although not explicitly described or shown herein, embody the principles
of the invention and are included within its spirit and scope. Furthermore, all examples
recited herein are principally intended expressly to be only for pedagogical purposes
to aid the reader in understanding the principles of the invention and the concepts
contributed by the inventor(s) to furthering the art, and are to be construed as being
without limitation to such specifically recited examples and conditions. Moreover,
all statements herein reciting principles, aspects, and embodiments of the invention,
as well as specific examples thereof, are intended to encompass equivalents thereof.
When provided by a processor, the functions may be provided by a single dedicated
processor, by a single shared processor, or by a plurality of individual processors,
some of which may be shared. Moreover, explicit use of the term "processor" or "controller"
should not be construed to refer exclusively to hardware capable of executing software,
and may implicitly include, without limitation, Digital Signal Processor (DSP) hardware,
network processor, application specific integrated circuit (ASIC), field programmable
gate array (FPGA), read only memory (ROM) for storing software, random access memory
(RAM), and non-volatile storage. Other hardware, conventional or custom, may also
be included. Their function may be carried out through the operation of program logic,
through dedicated logic, through the interaction of program control and dedicated
logic, or even manually, the particular technique being selectable by the implementer
as more specifically understood from the context.
[0060] It should be appreciated by those skilled in the art that any block diagrams herein
represent conceptual views of illustrative circuitry embodying the principles of the
invention. Similarly, it will be appreciated that any flow charts, flow diagrams,
state transition diagrams, pseudo code, and the like represent various processes which
may be substantially represented in computer readable medium and so executed by a
computer or processor, whether or not such computer or processor is explicitly shown.
[0061] Furthermore, the following claims are hereby incorporated into the detailed description,
where each claim may stand on its own as a separate embodiment. While each claim may
stand on its own as a separate embodiment, it is to be noted that - although a dependent
claim may refer in the claims to a specific combination with one or more other claims
- other embodiments may also include a combination of the dependent claim with the
subject matter of each other dependent claim. Such combinations are proposed herein
unless it is stated that a specific combination is not intended. Furthermore, it is
intended to include also features of a claim to any other independent claim even if
this claim is not directly made dependent to the independent claim.
[0062] It is further to be noted that methods disclosed in the specification or in the claims
may be implemented by a device having means for performing each of the respective
steps of these methods.
List of reference signs
[0063]
- 10
- apparatus for monitoring a vehicle
- 12
- one or more interfaces
- 14
- control module
- 20
- apparatus for an application server
- 22
- one or more interfaces
- 24
- control module
- 30
- apparatus for a mobile transceiver
- 32
- one or more interfaces
- 34
- control module
- 40
- method for monitoring a vehicle
- 42
- determining a parking situation of the vehicle
- 44
- detecting a trigger event
- 46
- transmitting a message comprising information on the trigger event to a predefined
device in case the trigger event is detected
- 50
- method for an application server
- 52
- obtaining a request for obtaining image data of a first vehicle
- 54
- determining a second vehicle, which is capable of determining such image data
- 56
- instructing the second vehicle to obtain and provide the image data
- 60
- method for a mobile transceiver
- 62
- generating a request for obtaining image data of a first vehicle
- 64
- forwarding the request to a second vehicle, which is capable of determining such image
data
- 66
- instructing the second vehicle to obtain and provide the image data
- 100
- vehicle
- 102
- vehicle in the vicinity
- 200
- application server
- 300
- mobile transceiver
- 400
- mobile communication system
1. An apparatus (10) for monitoring a vehicle (100), the apparatus (10) comprising
one or more interfaces (12) configured to communicate in a mobile communication system
(400); and
a control module (14) configured to control one or more interfaces (12), wherein the
control module (14) is further configured to
determine a parking situation of the vehicle (100);
detect a trigger event; and
transmit a message comprising information on the trigger event to a predefined device
in case the trigger event is detected.
2. The apparatus (10) of claim 1, wherein the control module (14) is further configured
to request a vehicle (102) in the vicinity of the vehicle (100) to provide image data
on the vehicle (100).
3. The apparatus (10) of one of the claims 1 or 2, wherein the predefined device is an
application server (200) or user equipment (300) and wherein the control module (14)
is further configured to detect the trigger event based on one or more elements of
the group of information received from a vehicle owner (300), information received
from an application server (200), information received from another vehicle (102),
and sensor data from the vehicle (100).
4. The apparatus (10) of claim 3, wherein the control module (14) is further configured
to request the image data upon detection of the trigger event and/or upon request
from another entity.
5. The apparatus (10) of claim 2, wherein the control module (14) is further configured
to determine the vehicle (102) in the vicinity by communicating with an application
server (200) through the mobile communication system (400), or by broadcasting a request
using direct communication through the mobile communication system (400).
6. The apparatus (10) of one of the claims 4 or 5, wherein the control module (14) is
configured to communicate a request to provide the image data of the vehicle (100),
to communicate with the application server (200), and/or to communicate with an owner
of the vehicle (100) using the one or more interfaces (12).
7. An apparatus (20) for an application server (200) being configured to communicate
through a mobile communication network (400), the apparatus (20) comprising
one or more interfaces (22) configured to communicate in the mobile communication
system (400); and
a control module (24) configured to control the one or more interfaces (22), wherein
the control module (24) is further configured to
obtain a request for obtaining image data of a first vehicle (100),
determine a second vehicle (102), which is capable of determining such image data,
and instruct the second vehicle (102) to obtain and provide the image data.
8. The apparatus (20) of claim 7, wherein the control module (24) is further configured
to receive the image data from the second vehicle (102) and to provide the image data
to an owner of the first vehicle (100), or wherein the control module (24) is further
configured to instruct the second vehicle (102) to provide the image data to the owner
of the first vehicle (100).
9. The apparatus (20) of one of the claims 7 or 8, further comprising a data base (26),
which is configured to store information on one or more vehicles and their locations,
wherein the control module (24) is further configured to determine the second vehicle
(102) in the vicinity of the first vehicle (100) using the data base (26).
10. The apparatus (20) of one of the claims 7 to 9, wherein the control module (24) is
further configured to receive information on a trigger event from the first vehicle
(100), wherein the control module (24) is further configured to obtain the request
upon reception of the information on the trigger event.
11. An apparatus (30) for a mobile transceiver (300) of a mobile communication system
(400), the apparatus (30) comprising
one or more interfaces (32) configured to communicate in the mobile communication
system (400); and
a control module (34) configured to control the one or more interfaces (32), wherein
the control module (34) is further configured to
generate a request for obtaining image data of a first vehicle (100),
forward the request to a second vehicle (102), which is capable of determining such
image data, and
instruct the second vehicle (102) to obtain and provide the image data.
12. A method (40) for monitoring a vehicle (100), the method (40) comprising
determining (42) a parking situation of the vehicle (100);
detecting (44) a trigger event; and
transmitting (46) a message comprising information on the trigger event to a predefined
device in case the trigger event is detected.
13. A method (50) for an application server (200) being configured to communicate through
a mobile communication network (400), the method (50) comprising
obtaining (52) a request for obtaining image data of a first vehicle (100),
determining (54) a second vehicle (102), which is capable of determining such image
data, and
instructing (56) the second vehicle (102) to obtain and provide the image data.
14. A method (60) for a mobile transceiver (300) of a mobile communication system (400),
the method (60) comprising
generating (62) a request for obtaining image data of a first vehicle (100);
forwarding (64) the request to a second vehicle (102), which is capable of determining
such image data; and
instructing (66) the second vehicle (102) to obtain and provide the image data.
15. A computer program having a program code for performing at least one of the methods
of claims 12, 13 or 14, when the computer program is executed on a computer, a processor,
or a programmable hardware component.
Amended claims in accordance with Rule 137(2) EPC.
1. An apparatus (10) for monitoring a vehicle (100), the apparatus (10) comprising
one or more interfaces (12) configured to communicate in a mobile communication system
(400); and
a control module (14) configured to control one or more interfaces (12), wherein the
control module (14) is further configured to
determine a parking situation of the vehicle (100);
detect a trigger event based on information received from a vehicle owner (300); and
transmit a message comprising information on the trigger event to a predefined device
in case the trigger event is detected,
wherein the predefined device is an application server (200) or user equipment (300);
request a vehicle (102) in the vicinity of the vehicle (100) to provide image data
on the vehicle (100) upon detection of the trigger event and/or upon request from
another entity,
wherein the control module (14) is further configured to determine the vehicle (102)
in the vicinity by communicating with an application server (200) through the mobile
communication system (400).
2. The apparatus (10) of claim 1, wherein the control module (14) is configured to communicate
a request to provide the image data of the vehicle (100), to communicate with the
application server (200), and/or to communicate with an owner of the vehicle (100)
using the one or more interfaces (12).
3. An apparatus (20) for an application server (200) being configured to communicate
through a mobile communication network (400), the apparatus (20) comprising one or
more interfaces (22) configured to communicate in the mobile communication system
(400); and
a control module (24) configured to control the one or more interfaces (22), wherein
the control module (24) is further configured to
obtain a request for obtaining image data of a first vehicle (100),
determine a second vehicle (102), which is capable of determining such image data,
and instruct the second vehicle (102) to obtain and provide the image data to a mobile
transceiver (300) of an owner of the first vehicle.
4. The apparatus (20) of claim 3, further comprising a data base (26), which is configured
to store information on one or more vehicles and their locations, wherein the control
module (24) is further configured to determine the second vehicle (102) in the vicinity
of the first vehicle (100) using the data base (26).
5. The apparatus (20) of one of the claims 3 or 4, wherein the control module (24) is
further configured to receive information on a trigger event from the first vehicle
(100), wherein the control module (24) is further configured to obtain the request
upon reception of the information on the trigger event.
6. An apparatus (30) for a mobile transceiver (300) of a mobile communication system
(400), the apparatus (30) comprising
one or more interfaces (32) configured to communicate in the mobile communication
system (400); and
a control module (34) configured to control the one or more interfaces (32), wherein
the control module (34) is further configured to
generate a request for obtaining image data of a first vehicle (100),
forward the request to a second vehicle (102), which is capable of determining such
image data, and
instruct the second vehicle (102) to obtain and provide the image data.
7. A method (40) for monitoring a vehicle (100), the method (40) comprising
determining (42) a parking situation of the vehicle (100);
detecting (44) a trigger event based on information received from a vehicle owner
(300); transmitting (46) a message comprising information on the trigger event to
a predefined device in case the trigger event is detected,
wherein the predefined device is an application server (200) or user equipment (300);
requesting a vehicle (102) in the vicinity of the vehicle (100) to provide image data
on the vehicle (100) upon detection of the trigger event and/or upon request from
another entity; and
determining the vehicle (102) in the vicinity by communicating with an application
server (200) through the mobile communication system (400).
8. A method (50) for an application server (200) being configured to communicate through
a mobile communication network (400), the method (50) comprising
obtaining (52) a request for obtaining image data of a first vehicle (100),
determining (54) a second vehicle (102), which is capable of determining such image
data, and
instructing (56) the second vehicle (102) to obtain and provide the image data to
a mobile transceiver (300) of an owner of the first vehicle (100).
9. A method (60) for a mobile transceiver (300) of a mobile communication system (400),
the method (60) comprising
generating (62) a request for obtaining image data of a first vehicle (100);
forwarding (64) the request to a second vehicle (102), which is capable of determining
such image data; and
instructing (66) the second vehicle (102) to obtain and provide the image data.
10. A computer program having a program code for performing at least one of the methods
of claims 7, 8 or 9, when the computer program is executed on a computer, a processor,
or a programmable hardware component.