Clarify use of forcing routing steps (#6866)
The change clarifies the conditions for forcing routing steps and simplifies the codebase to support it.

- Makes explicit the search runtime condition for forcing a routing step. Namely, the node is a source of both the forward and reverse searches, and it is one of the pre-identified nodes that require a forced step.
- Consolidates the two lists of force nodes into one. Not only is there no algorithmic value in separating the nodes by geometric direction, the improvements to via-routes with u-turns mean at least one of these lists will be empty for any search.
- Renames 'force loop' to 'force step'. This moves the code away from the original CH-specific language for checking for self-loops when this condition is met; MLD does not have loops.

Additional cucumber tests are added to cover the logic related to negative search weights and forcing routing steps on via-route paths.
parent 70969186f6
commit ffc39b8ad2
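The runtime condition described in the commit message reduces to a single predicate that is evaluated when a node settled by one search has also been reached by the other. The sketch below is illustrative only: `SketchHeapNode` and `sketchShouldForceStep` are hypothetical stand-ins for the real heap node type and the `shouldForceStep` helper that appears in the diff below.

```cpp
// Minimal sketch (not OSRM code) of the force-step predicate: force a routing
// step only when the node is a *source* of both the forward and the reverse
// search (its heap parent is itself) and it is one of the pre-identified
// force nodes.
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

using NodeID = std::uint32_t;

struct SketchHeapNode
{
    NodeID node;   // node being settled
    NodeID parent; // parent in the search tree; equals `node` for a search source
};

bool sketchShouldForceStep(const std::vector<NodeID> &force_nodes,
                           const SketchHeapNode &forward_node,
                           const SketchHeapNode &reverse_node)
{
    return forward_node.parent == forward_node.node &&
           reverse_node.parent == reverse_node.node &&
           std::find(force_nodes.begin(), force_nodes.end(), forward_node.node) !=
               force_nodes.end();
}

int main()
{
    const std::vector<NodeID> force_nodes = {42};
    // Node 42 is a source of both searches and is pre-identified: force the step.
    assert(sketchShouldForceStep(force_nodes, {42, 42}, {42, 42}));
    // Node 42 was reached via another node in the forward search: no forced step.
    assert(!sketchShouldForceStep(force_nodes, {42, 7}, {42, 42}));
    return 0;
}
```

Because the same predicate is used by both the CH and MLD routing steps, the previously separate forward and reverse force lists can be passed as the single `force_step_nodes` list seen throughout the diff.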
@@ -46,6 +46,7 @@
     - FIXED: Correctly handle compressed traffic signals. [#6724](https://github.com/Project-OSRM/osrm-backend/pull/6724)
     - FIXED: Fix bug when searching for maneuver overrides [#6739](https://github.com/Project-OSRM/osrm-backend/pull/6739)
     - FIXED: Remove force-loop checks for routes with u-turns [#6858](https://github.com/Project-OSRM/osrm-backend/pull/6858)
+    - FIXED: Correctly check runtime search conditions for forcing routing steps [#6866](https://github.com/Project-OSRM/osrm-backend/pull/6866)
   - Debug tiles:
     - FIXED: Ensure speed layer features have unique ids. [#6726](https://github.com/Project-OSRM/osrm-backend/pull/6726)

features/testbot/force_step.feature (new file, 101 lines)
@@ -0,0 +1,101 @@
@routing @testbot @via
Feature: Force routing steps

    Background:
        Given the profile "testbot"

    Scenario: Direct routes with waypoints on same edge
        Given the node map
            """
              1   2
            a-------b
            |       |
            d-------c
            |       |
            e-------f
              3   4
            """

        And the ways
            | nodes | oneway |
            | ab    | no     |
            | ad    | no     |
            | bc    | no     |
            | cf    | no     |
            | dc    | no     |
            | de    | no     |
            | ef    | yes    |

        When I route I should get
            | waypoints | approaches        | weight | route          |
            | 1,2       |                   | 20     | ab,ab          |
            | 1,2       | curb curb         | 100    | ab,ad,dc,bc,ab |
            | 2,1       |                   | 20     | ab,ab          |
            | 2,1       | opposite opposite | 100    | ab,bc,dc,ad,ab |
            | 3,4       |                   | 20     | ef,ef          |
            | 4,3       |                   | 100    | ef,cf,dc,de,ef |

    Scenario: Via routes with waypoints on same edge
        Given the node map
            """
              1   2
            a-------b
            |       |
            d-5-----c
            |       |
            e-------f
              3   4
            """

        And the ways
            | nodes | oneway |
            | ab    | no     |
            | ad    | no     |
            | bc    | no     |
            | cf    | no     |
            | dc    | no     |
            | de    | no     |
            | ef    | yes    |

        When I route I should get
            | waypoints | approaches                     | weight | route                      |
            | 5,1,2     |                                | 59.8   | dc,ad,ab,ab,ab             |
            | 5,1,2     | unrestricted curb curb         | 180.2  | dc,bc,ab,ab,ab,ad,dc,bc,ab |
            | 5,2,1     |                                | 80.2   | dc,bc,ab,ab,ab             |
            | 5,2,1     | unrestricted opposite opposite | 159.8  | dc,ad,ab,ab,ab,bc,dc,ad,ab |
            | 5,3,4     |                                | 59.8   | dc,de,ef,ef,ef             |
            | 5,4,3     |                                | 159.8  | dc,de,ef,ef,ef,cf,dc,de,ef |


    Scenario: [U-turns allowed] Via routes with waypoints on same edge
        Given the node map
            """
              1   2
            a-------b
            |       |
            d-5-----c
            |       |
            e-------f
              3   4
            """

        And the ways
            | nodes | oneway |
            | ab    | no     |
            | ad    | no     |
            | bc    | no     |
            | cf    | no     |
            | dc    | no     |
            | de    | no     |
            | ef    | yes    |

        And the query options
            | continue_straight | false |

        When I route I should get
            | waypoints | approaches                     | weight | route                      |
            | 5,1,2     |                                | 59.8   | dc,ad,ab,ab,ab             |
            | 5,1,2     | unrestricted curb curb         | 180.2  | dc,bc,ab,ab,ab,ad,dc,bc,ab |
            | 5,2,1     |                                | 79.8   | dc,ad,ab,ab,ab,ab          |
            | 5,2,1     | unrestricted opposite opposite | 159.8  | dc,ad,ab,ab,ab,bc,dc,ad,ab |
            | 5,3,4     |                                | 59.8   | dc,de,ef,ef,ef             |
            | 5,4,3     |                                | 159.8  | dc,de,ef,ef,ef,cf,dc,de,ef |

@@ -71,24 +71,27 @@ void insertTargetInReverseHeap(Heap &reverse_heap, const PhantomNode &target)
 static constexpr bool FORWARD_DIRECTION = true;
 static constexpr bool REVERSE_DIRECTION = false;
 
-// Identify nodes in the forward(reverse) search direction that will require loop forcing
+// Identify nodes in the forward(reverse) search direction that will require step forcing
 // e.g. if source and destination nodes are on the same segment.
-std::vector<NodeID> getForwardLoopNodes(const PhantomEndpointCandidates &candidates);
-std::vector<NodeID> getForwardLoopNodes(const PhantomCandidatesToTarget &candidates);
-std::vector<NodeID> getBackwardLoopNodes(const PhantomEndpointCandidates &candidates);
-std::vector<NodeID> getBackwardLoopNodes(const PhantomCandidatesToTarget &candidates);
+std::vector<NodeID> getForwardForceNodes(const PhantomEndpointCandidates &candidates);
+std::vector<NodeID> getForwardForceNodes(const PhantomCandidatesToTarget &candidates);
+std::vector<NodeID> getBackwardForceNodes(const PhantomEndpointCandidates &candidates);
+std::vector<NodeID> getBackwardForceNodes(const PhantomCandidatesToTarget &candidates);
 
 // Find the specific phantom node endpoints for a given path from a list of candidates.
 PhantomEndpoints endpointsFromCandidates(const PhantomEndpointCandidates &candidates,
                                          const std::vector<NodeID> &path);
 
 template <typename HeapNodeT>
-inline bool force_loop(const std::vector<NodeID> &force_nodes, const HeapNodeT &heap_node)
+inline bool shouldForceStep(const std::vector<NodeID> &force_nodes,
+                            const HeapNodeT &forward_heap_node,
+                            const HeapNodeT &reverse_heap_node)
 {
-    // if loops are forced, they are so at the source
-    return !force_nodes.empty() &&
-           std::find(force_nodes.begin(), force_nodes.end(), heap_node.node) != force_nodes.end() &&
-           heap_node.data.parent == heap_node.node;
+    // routing steps are forced when the node is a source of both forward and reverse search heaps.
+    return forward_heap_node.data.parent == forward_heap_node.node &&
+           reverse_heap_node.data.parent == reverse_heap_node.node &&
+           std::find(force_nodes.begin(), force_nodes.end(), forward_heap_node.node) !=
+               force_nodes.end();
 }
 
 template <typename Heap>

@@ -112,8 +112,7 @@ void routingStep(const DataFacade<Algorithm> &facade,
                  NodeID &middle_node_id,
                  EdgeWeight &upper_bound,
                  EdgeWeight min_edge_offset,
-                 const std::vector<NodeID> &force_loop_forward_nodes,
-                 const std::vector<NodeID> &force_loop_reverse_nodes)
+                 const std::vector<NodeID> &force_step_nodes)
 {
     auto heapNode = forward_heap.DeleteMinGetHeapNode();
     const auto reverseHeapNode = reverse_heap.GetHeapNodeIfWasInserted(heapNode.node);

@@ -123,13 +122,13 @@ void routingStep(const DataFacade<Algorithm> &facade,
         const EdgeWeight new_weight = reverseHeapNode->weight + heapNode.weight;
         if (new_weight < upper_bound)
         {
-            if (force_loop(force_loop_forward_nodes, heapNode) ||
-                force_loop(force_loop_reverse_nodes, heapNode) ||
+            if (shouldForceStep(force_step_nodes, heapNode, reverseHeapNode.get()) ||
                 // in this case we are looking at a bi-directional way where the source
                 // and target phantom are on the same edge based node
                 new_weight < EdgeWeight{0})
             {
-                // check whether there is a loop present at the node
+                // Before forcing step, check whether there is a loop present at the node.
+                // We may find a valid weight path by following the loop.
                 for (const auto edge : facade.GetAdjacentEdgeRange(heapNode.node))
                 {
                     const auto &data = facade.GetEdgeData(edge);

@@ -421,23 +420,22 @@ void retrievePackedPathFromSingleManyToManyHeap(
 // assumes that heaps are already setup correctly.
 // ATTENTION: This only works if no additional offset is supplied next to the Phantom Node
 // Offsets.
-// In case additional offsets are supplied, you might have to force a loop first.
-// A forced loop might be necessary, if source and target are on the same segment.
+// In case additional offsets are supplied, you might have to force a routing step first.
+// A forced step might be necessary, if source and target are on the same segment.
 // If this is the case and the offsets of the respective direction are larger for the source
 // than the target
-// then a force loop is required (e.g. source_phantom.forward_segment_id ==
+// then a force step is required (e.g. source_phantom.forward_segment_id ==
 // target_phantom.forward_segment_id
 // && source_phantom.GetForwardWeightPlusOffset() > target_phantom.GetForwardWeightPlusOffset())
 // requires
-// a force loop, if the heaps have been initialized with positive offsets.
+// a force step, if the heaps have been initialized with positive offsets.
 void search(SearchEngineData<Algorithm> &engine_working_data,
             const DataFacade<Algorithm> &facade,
             SearchEngineData<Algorithm>::QueryHeap &forward_heap,
             SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
             EdgeWeight &weight,
             std::vector<NodeID> &packed_leg,
-            const std::vector<NodeID> &force_loop_forward_node,
-            const std::vector<NodeID> &force_loop_reverse_node,
+            const std::vector<NodeID> &force_step_nodes,
             const EdgeWeight duration_upper_bound = INVALID_EDGE_WEIGHT);
 
 template <typename PhantomEndpointT>

@@ -447,8 +445,7 @@ void search(SearchEngineData<Algorithm> &engine_working_data,
             SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
             EdgeWeight &weight,
             std::vector<NodeID> &packed_leg,
-            const std::vector<NodeID> &force_loop_forward_node,
-            const std::vector<NodeID> &force_loop_reverse_node,
+            const std::vector<NodeID> &force_step_nodes,
             const PhantomEndpointT & /*endpoints*/,
             const EdgeWeight duration_upper_bound = INVALID_EDGE_WEIGHT)
 {

@@ -459,14 +456,13 @@ void search(SearchEngineData<Algorithm> &engine_working_data,
               reverse_heap,
               weight,
               packed_leg,
-              force_loop_forward_node,
-              force_loop_reverse_node,
+              force_step_nodes,
               duration_upper_bound);
 }
 
 // Requires the heaps for be empty
 // If heaps should be adjusted to be initialized outside of this function,
-// the addition of force_loop parameters might be required
+// the addition of force_step parameters might be required
 double getNetworkDistance(SearchEngineData<Algorithm> &engine_working_data,
                           const DataFacade<ch::Algorithm> &facade,
                           SearchEngineData<Algorithm>::QueryHeap &forward_heap,

@@ -389,8 +389,7 @@ void routingStep(const DataFacade<Algorithm> &facade,
                  typename SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
                  NodeID &middle_node,
                  EdgeWeight &path_upper_bound,
-                 const std::vector<NodeID> &force_loop_forward_nodes,
-                 const std::vector<NodeID> &force_loop_reverse_nodes,
+                 const std::vector<NodeID> &force_step_nodes,
                  const Args &...args)
 {
     const auto heapNode = forward_heap.DeleteMinGetHeapNode();

@@ -409,11 +408,8 @@ void routingStep(const DataFacade<Algorithm> &facade,
         auto reverse_weight = reverseHeapNode->weight;
         auto path_weight = weight + reverse_weight;
 
-        // MLD uses loops forcing only to prune single node paths in forward and/or
-        // backward direction (there is no need to force loops in MLD but in CH)
-        if (!force_loop(force_loop_forward_nodes, heapNode) &&
-            !force_loop(force_loop_reverse_nodes, heapNode) && (path_weight >= EdgeWeight{0}) &&
-            (path_weight < path_upper_bound))
+        if (!shouldForceStep(force_step_nodes, heapNode, reverseHeapNode.get()) &&
+            (path_weight >= EdgeWeight{0}) && (path_weight < path_upper_bound))
         {
             middle_node = heapNode.node;
             path_upper_bound = path_weight;

@@ -438,8 +434,7 @@ UnpackedPath search(SearchEngineData<Algorithm> &engine_working_data,
                     const DataFacade<Algorithm> &facade,
                     typename SearchEngineData<Algorithm>::QueryHeap &forward_heap,
                     typename SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
-                    const std::vector<NodeID> &force_loop_forward_nodes,
-                    const std::vector<NodeID> &force_loop_reverse_nodes,
+                    const std::vector<NodeID> &force_step_nodes,
                     EdgeWeight weight_upper_bound,
                     const Args &...args)
 {

@@ -463,27 +458,15 @@ UnpackedPath search(SearchEngineData<Algorithm> &engine_working_data,
     {
         if (!forward_heap.Empty())
         {
-            routingStep<FORWARD_DIRECTION>(facade,
-                                           forward_heap,
-                                           reverse_heap,
-                                           middle,
-                                           weight,
-                                           force_loop_forward_nodes,
-                                           force_loop_reverse_nodes,
-                                           args...);
+            routingStep<FORWARD_DIRECTION>(
+                facade, forward_heap, reverse_heap, middle, weight, force_step_nodes, args...);
             if (!forward_heap.Empty())
                 forward_heap_min = forward_heap.MinKey();
         }
         if (!reverse_heap.Empty())
         {
-            routingStep<REVERSE_DIRECTION>(facade,
-                                           reverse_heap,
-                                           forward_heap,
-                                           middle,
-                                           weight,
-                                           force_loop_reverse_nodes,
-                                           force_loop_forward_nodes,
-                                           args...);
+            routingStep<REVERSE_DIRECTION>(
+                facade, reverse_heap, forward_heap, middle, weight, force_step_nodes, args...);
             if (!reverse_heap.Empty())
                 reverse_heap_min = reverse_heap.MinKey();
         }

@@ -512,9 +495,7 @@ UnpackedPath search(SearchEngineData<Algorithm> &engine_working_data,
 
     for (auto const &packed_edge : packed_path)
     {
-        NodeID source, target;
-        bool overlay_edge;
-        std::tie(source, target, overlay_edge) = packed_edge;
+        auto [source, target, overlay_edge] = packed_edge;
         if (!overlay_edge)
         { // a base graph edge
             unpacked_nodes.push_back(target);

@@ -534,18 +515,11 @@ UnpackedPath search(SearchEngineData<Algorithm> &engine_working_data,
             forward_heap.Insert(source, {0}, {source});
             reverse_heap.Insert(target, {0}, {target});
 
-            // TODO: when structured bindings will be allowed change to
-            // auto [subpath_weight, subpath_source, subpath_target, subpath] = ...
-            EdgeWeight subpath_weight;
-            std::vector<NodeID> subpath_nodes;
-            std::vector<EdgeID> subpath_edges;
-            std::tie(subpath_weight, subpath_nodes, subpath_edges) =
-                search(engine_working_data,
+            auto [subpath_weight, subpath_nodes, subpath_edges] = search(engine_working_data,
                        facade,
                        forward_heap,
                        reverse_heap,
-                       force_loop_forward_nodes,
-                       force_loop_reverse_nodes,
+                       force_step_nodes,
                        INVALID_EDGE_WEIGHT,
                        sublevel,
                        parent_cell_id);

@@ -570,8 +544,7 @@ inline void search(SearchEngineData<Algorithm> &engine_working_data,
                    typename SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
                    EdgeWeight &weight,
                    std::vector<NodeID> &unpacked_nodes,
-                   const std::vector<NodeID> &force_loop_forward_node,
-                   const std::vector<NodeID> &force_loop_reverse_node,
+                   const std::vector<NodeID> &force_step_nodes,
                    const PhantomEndpointT &endpoints,
                    const EdgeWeight weight_upper_bound = INVALID_EDGE_WEIGHT)
 {

@@ -580,8 +553,7 @@ inline void search(SearchEngineData<Algorithm> &engine_working_data,
                    facade,
                    forward_heap,
                    reverse_heap,
-                   force_loop_forward_node,
-                   force_loop_reverse_node,
+                   force_step_nodes,
                    weight_upper_bound,
                    endpoints);
 }

@@ -633,17 +605,8 @@ double getNetworkDistance(SearchEngineData<Algorithm> &engine_working_data,
     const PhantomEndpoints endpoints{source_phantom, target_phantom};
     insertNodesInHeaps(forward_heap, reverse_heap, endpoints);
 
-    EdgeWeight weight = INVALID_EDGE_WEIGHT;
-    std::vector<NodeID> unpacked_nodes;
-    std::vector<EdgeID> unpacked_edges;
-    std::tie(weight, unpacked_nodes, unpacked_edges) = search(engine_working_data,
-                                                              facade,
-                                                              forward_heap,
-                                                              reverse_heap,
-                                                              {},
-                                                              {},
-                                                              weight_upper_bound,
-                                                              endpoints);
+    auto [weight, unpacked_nodes, unpacked_edges] = search(
+        engine_working_data, facade, forward_heap, reverse_heap, {}, weight_upper_bound, endpoints);
 
     if (weight == INVALID_EDGE_WEIGHT)
     {

@@ -64,7 +64,6 @@ void searchWithUTurn(SearchEngineData<Algorithm> &engine_working_data,
               leg_weight,
               leg_packed_path,
               {},
-              {},
               candidates);
 }
 

@@ -124,8 +123,7 @@ void search(SearchEngineData<Algorithm> &engine_working_data,
                   reverse_heap,
                   new_total_weight_to_forward,
                   leg_packed_path_forward,
-                  getForwardLoopNodes(candidates),
-                  {},
+                  getForwardForceNodes(candidates),
                   candidates);
     }
 

@@ -164,8 +162,7 @@ void search(SearchEngineData<Algorithm> &engine_working_data,
                   reverse_heap,
                   new_total_weight_to_reverse,
                   leg_packed_path_reverse,
-                  {},
-                  getBackwardLoopNodes(candidates),
+                  getBackwardForceNodes(candidates),
                   candidates);
     }
 }

@@ -187,7 +187,6 @@ void computeWeightAndSharingOfViaPath(SearchEngineData<Algorithm> &engine_workin
                                            s_v_middle,
                                            upper_bound_s_v_path_weight,
                                            min_edge_offset,
-                                           {},
                                            {});
     }
     // compute path <v,..,t> by reusing backward search from node t

@@ -202,7 +201,6 @@ void computeWeightAndSharingOfViaPath(SearchEngineData<Algorithm> &engine_workin
                                            v_t_middle,
                                            upper_bound_of_v_t_path_weight,
                                            min_edge_offset,
-                                           {},
                                            {});
     }
     *real_weight_of_via_path = upper_bound_s_v_path_weight + upper_bound_of_v_t_path_weight;

@@ -348,7 +346,6 @@ bool viaNodeCandidatePassesTTest(SearchEngineData<Algorithm> &engine_working_dat
                                            *s_v_middle,
                                            upper_bound_s_v_path_weight,
                                            min_edge_offset,
-                                           {},
                                            {});
     }
 

@@ -369,7 +366,6 @@ bool viaNodeCandidatePassesTTest(SearchEngineData<Algorithm> &engine_working_dat
                                            *v_t_middle,
                                            upper_bound_of_v_t_path_weight,
                                            min_edge_offset,
-                                           {},
                                            {});
     }
 

@@ -538,12 +534,12 @@ bool viaNodeCandidatePassesTTest(SearchEngineData<Algorithm> &engine_working_dat
         if (!forward_heap3.Empty())
         {
             routingStep<FORWARD_DIRECTION>(
-                facade, forward_heap3, reverse_heap3, middle, upper_bound, min_edge_offset, {}, {});
+                facade, forward_heap3, reverse_heap3, middle, upper_bound, min_edge_offset, {});
         }
         if (!reverse_heap3.Empty())
         {
             routingStep<REVERSE_DIRECTION>(
-                facade, reverse_heap3, forward_heap3, middle, upper_bound, min_edge_offset, {}, {});
+                facade, reverse_heap3, forward_heap3, middle, upper_bound, min_edge_offset, {});
         }
     }
     return (upper_bound <= t_test_path_weight);

@@ -631,7 +631,6 @@ void unpackPackedPaths(InputIt first,
                        forward_heap,
                        reverse_heap,
                        {},
-                       {},
                        INVALID_EDGE_WEIGHT,
                        sublevel,
                        parent_cell_id);

@@ -720,7 +719,6 @@ makeCandidateVias(SearchEngineData<Algorithm> &search_engine_data,
                   overlap_via,
                   overlap_weight,
                   {},
-                  {},
                   endpoint_candidates);
 
     if (!forward_heap.Empty())

@@ -746,7 +744,6 @@ makeCandidateVias(SearchEngineData<Algorithm> &search_engine_data,
                   overlap_via,
                   overlap_weight,
                   {},
-                  {},
                   endpoint_candidates);
 
     if (!reverse_heap.Empty())

|
@ -34,7 +34,6 @@ InternalRouteResult directShortestPathSearch(SearchEngineData<ch::Algorithm> &en
|
|||||||
weight,
|
weight,
|
||||||
packed_leg,
|
packed_leg,
|
||||||
{},
|
{},
|
||||||
{},
|
|
||||||
endpoint_candidates);
|
endpoint_candidates);
|
||||||
|
|
||||||
std::vector<NodeID> unpacked_nodes;
|
std::vector<NodeID> unpacked_nodes;
|
||||||
@ -81,7 +80,6 @@ InternalRouteResult directShortestPathSearch(SearchEngineData<mld::Algorithm> &e
|
|||||||
forward_heap,
|
forward_heap,
|
||||||
reverse_heap,
|
reverse_heap,
|
||||||
{},
|
{},
|
||||||
{},
|
|
||||||
INVALID_EDGE_WEIGHT,
|
INVALID_EDGE_WEIGHT,
|
||||||
endpoint_candidates);
|
endpoint_candidates);
|
||||||
|
|
||||||
|
@@ -3,21 +3,29 @@
 namespace osrm::engine::routing_algorithms
 {
 
-bool requiresForwardLoop(const PhantomNode &source, const PhantomNode &target)
+bool requiresForwardForce(const PhantomNode &source, const PhantomNode &target)
 {
+    // Conditions to force a routing step:
+    // - Valid source and target.
+    // - Source and target on same segment.
+    // - Source is "downstream" of target in the direction of the edge.
     return source.IsValidForwardSource() && target.IsValidForwardTarget() &&
            source.forward_segment_id.id == target.forward_segment_id.id &&
            source.GetForwardWeightPlusOffset() > target.GetForwardWeightPlusOffset();
 }
 
-bool requiresBackwardLoop(const PhantomNode &source, const PhantomNode &target)
+bool requiresBackwardForce(const PhantomNode &source, const PhantomNode &target)
 {
+    // Conditions to force a routing step:
+    // - Valid source and target.
+    // - Source and target on same segment.
+    // - Source is "downstream" of target in the direction of the edge.
     return source.IsValidReverseSource() && target.IsValidReverseTarget() &&
            source.reverse_segment_id.id == target.reverse_segment_id.id &&
            source.GetReverseWeightPlusOffset() > target.GetReverseWeightPlusOffset();
 }
 
-std::vector<NodeID> getForwardLoopNodes(const PhantomEndpointCandidates &endpoint_candidates)
+std::vector<NodeID> getForwardForceNodes(const PhantomEndpointCandidates &endpoint_candidates)
 {
     std::vector<NodeID> res;
     for (const auto &source_phantom : endpoint_candidates.source_phantoms)

@@ -26,7 +34,7 @@ std::vector<NodeID> getForwardLoopNodes(const PhantomEndpointCandidates &endpoin
             std::any_of(endpoint_candidates.target_phantoms.begin(),
                         endpoint_candidates.target_phantoms.end(),
                         [&](const auto &target_phantom)
-                        { return requiresForwardLoop(source_phantom, target_phantom); });
+                        { return requiresForwardForce(source_phantom, target_phantom); });
         if (requires_loop)
         {
             res.push_back(source_phantom.forward_segment_id.id);

@@ -35,12 +43,12 @@ std::vector<NodeID> getForwardLoopNodes(const PhantomEndpointCandidates &endpoin
     return res;
 }
 
-std::vector<NodeID> getForwardLoopNodes(const PhantomCandidatesToTarget &endpoint_candidates)
+std::vector<NodeID> getForwardForceNodes(const PhantomCandidatesToTarget &endpoint_candidates)
 {
     std::vector<NodeID> res;
     for (const auto &source_phantom : endpoint_candidates.source_phantoms)
     {
-        if (requiresForwardLoop(source_phantom, endpoint_candidates.target_phantom))
+        if (requiresForwardForce(source_phantom, endpoint_candidates.target_phantom))
         {
             res.push_back(source_phantom.forward_segment_id.id);
         }

@@ -48,7 +56,7 @@ std::vector<NodeID> getForwardLoopNodes(const PhantomCandidatesToTarget &endpoin
     return res;
 }
 
-std::vector<NodeID> getBackwardLoopNodes(const PhantomEndpointCandidates &endpoint_candidates)
+std::vector<NodeID> getBackwardForceNodes(const PhantomEndpointCandidates &endpoint_candidates)
 {
     std::vector<NodeID> res;
     for (const auto &source_phantom : endpoint_candidates.source_phantoms)

@@ -57,7 +65,7 @@ std::vector<NodeID> getBackwardLoopNodes(const PhantomEndpointCandidates &endpoi
             std::any_of(endpoint_candidates.target_phantoms.begin(),
                         endpoint_candidates.target_phantoms.end(),
                         [&](const auto &target_phantom)
-                        { return requiresBackwardLoop(source_phantom, target_phantom); });
+                        { return requiresBackwardForce(source_phantom, target_phantom); });
         if (requires_loop)
         {
             res.push_back(source_phantom.reverse_segment_id.id);

@@ -66,12 +74,12 @@ std::vector<NodeID> getBackwardLoopNodes(const PhantomEndpointCandidates &endpoi
     return res;
 }
 
-std::vector<NodeID> getBackwardLoopNodes(const PhantomCandidatesToTarget &endpoint_candidates)
+std::vector<NodeID> getBackwardForceNodes(const PhantomCandidatesToTarget &endpoint_candidates)
 {
     std::vector<NodeID> res;
    for (const auto &source_phantom : endpoint_candidates.source_phantoms)
     {
-        if (requiresBackwardLoop(source_phantom, endpoint_candidates.target_phantom))
+        if (requiresBackwardForce(source_phantom, endpoint_candidates.target_phantom))
         {
             res.push_back(source_phantom.reverse_segment_id.id);
         }

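The comments added to `requiresForwardForce`/`requiresBackwardForce` above list three conditions. The toy example below restates them with a simplified phantom type; `SketchPhantom` and `sketchRequiresForwardForce` are hypothetical illustration names, not OSRM's `PhantomNode` API, and the single `forward_valid` flag stands in for the separate source/target validity checks.

```cpp
// Toy illustration of the "downstream" condition: both phantoms snap to the
// same forward segment and the source sits further along it than the target,
// so the search must be forced to step off the segment and come back around.
#include <cassert>
#include <cstdint>

using SegmentID = std::uint32_t;

struct SketchPhantom
{
    SegmentID forward_segment_id;
    bool forward_valid;
    int forward_weight_plus_offset; // weight to reach the phantom along the segment
};

bool sketchRequiresForwardForce(const SketchPhantom &source, const SketchPhantom &target)
{
    return source.forward_valid && target.forward_valid &&
           source.forward_segment_id == target.forward_segment_id &&
           source.forward_weight_plus_offset > target.forward_weight_plus_offset;
}

int main()
{
    // Same segment, source ahead of target: the node goes into the force list.
    assert(sketchRequiresForwardForce({7, true, 30}, {7, true, 10}));
    // Same segment, target ahead of source: the normal search suffices.
    assert(!sketchRequiresForwardForce({7, true, 10}, {7, true, 30}));
    return 0;
}
```

The cucumber scenarios added in `features/testbot/force_step.feature` above exercise exactly these same-edge cases.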
@@ -73,23 +73,22 @@ void retrievePackedPathFromSingleManyToManyHeap(
 // assumes that heaps are already setup correctly.
 // ATTENTION: This only works if no additional offset is supplied next to the Phantom Node
 // Offsets.
-// In case additional offsets are supplied, you might have to force a loop first.
-// A forced loop might be necessary, if source and target are on the same segment.
+// In case additional offsets are supplied, you might have to force a routing step first.
+// A forced step might be necessary, if source and target are on the same segment.
 // If this is the case and the offsets of the respective direction are larger for the source
 // than the target
-// then a force loop is required (e.g. source_phantom.forward_segment_id ==
+// then a force step is required (e.g. source_phantom.forward_segment_id ==
 // target_phantom.forward_segment_id
 // && source_phantom.GetForwardWeightPlusOffset() > target_phantom.GetForwardWeightPlusOffset())
 // requires
-// a force loop, if the heaps have been initialized with positive offsets.
+// a force step, if the heaps have been initialized with positive offsets.
 void search(SearchEngineData<Algorithm> & /*engine_working_data*/,
             const DataFacade<Algorithm> &facade,
             SearchEngineData<Algorithm>::QueryHeap &forward_heap,
             SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
             EdgeWeight &weight,
             std::vector<NodeID> &packed_leg,
-            const std::vector<NodeID> &force_loop_forward_nodes,
-            const std::vector<NodeID> &force_loop_reverse_nodes,
+            const std::vector<NodeID> &force_step_nodes,
             const EdgeWeight weight_upper_bound)
 {
     if (forward_heap.Empty() || reverse_heap.Empty())

@@ -118,8 +117,7 @@ void search(SearchEngineData<Algorithm> & /*engine_working_data*/,
                 middle,
                 weight,
                 min_edge_offset,
-                force_loop_forward_nodes,
-                force_loop_reverse_nodes);
+                force_step_nodes);
         }
         if (!reverse_heap.Empty())
         {

@@ -129,8 +127,7 @@ void search(SearchEngineData<Algorithm> & /*engine_working_data*/,
                 middle,
                 weight,
                 min_edge_offset,
-                force_loop_reverse_nodes,
-                force_loop_forward_nodes);
+                force_step_nodes);
         }
     }
 

@@ -159,7 +156,7 @@ void search(SearchEngineData<Algorithm> & /*engine_working_data*/,
 
 // Requires the heaps for be empty
 // If heaps should be adjusted to be initialized outside of this function,
-// the addition of force_loop parameters might be required
+// the addition of force_step parameters might be required
 double getNetworkDistance(SearchEngineData<Algorithm> &engine_working_data,
                           const DataFacade<Algorithm> &facade,
                           SearchEngineData<Algorithm>::QueryHeap &forward_heap,

@@ -183,7 +180,6 @@ double getNetworkDistance(SearchEngineData<Algorithm> &engine_working_data,
               weight,
               packed_path,
               {},
-              {},
               endpoints,
               weight_upper_bound);
 

@@ -334,8 +334,7 @@ inline void search(SearchEngineData<Algorithm> &engine_working_data,
                    typename SearchEngineData<Algorithm>::QueryHeap &reverse_heap,
                    EdgeWeight &weight,
                    std::vector<NodeID> &packed_leg,
-                   const std::vector<NodeID> &forward_loop_nodes,
-                   const std::vector<NodeID> &reverse_loop_nodes,
+                   const std::vector<NodeID> &loop_nodes,
                    const PhantomT &endpoints,
                    const EdgeWeight weight_upper_bound = INVALID_EDGE_WEIGHT)
 {

@@ -345,8 +344,7 @@ inline void search(SearchEngineData<Algorithm> &engine_working_data,
            reverse_heap,
            weight,
            packed_leg,
-           forward_loop_nodes,
-           reverse_loop_nodes,
+           loop_nodes,
            endpoints,
            weight_upper_bound);
 }