Hi @tkrajina,
again I'm looking into how the information (like time, duration, etc.) could be closer to Strava. My assumption is that Strava is quite good at analyzing data, so that's my baseline (to explain my motivation).
In your calculation of MovingData the const "defaultStoppedSpeedThreshold" is used to determine if a Point should be used for the sum of the data:
gpxgo/gpx/gpx.go
Line 1156 in b3936e3
Basically it says: "If the speed from the previous to the current point is above 1km/h, then use it for the MovingDistance and MovingTime (and vice versa for StoppedDistance and StoppedTime)."
Again fair enough since it's quite fast.
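For illustration only, my understanding of that threshold logic boils down to something like the following sketch (stoppedSpeedThresholdKmh and classifyPoint are made-up names for this sketch, not the actual gpxgo code; the MovingData field names are assumed):

```go
// Simplified sketch of a stopped-speed threshold check (illustration, not the gpxgo source).
const stoppedSpeedThresholdKmh = 1.0 // assumed value of defaultStoppedSpeedThreshold

func classifyPoint(speedKmh, distance, duration float64, md *MovingData) {
	if speedKmh > stoppedSpeedThresholdKmh {
		// Faster than the threshold: the point counts towards the moving totals.
		md.MovingDistance += distance
		md.MovingTime += duration
	} else {
		// At or below the threshold: the point counts towards the stopped totals.
		md.StoppedDistance += distance
		md.StoppedTime += duration
	}
}
```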
But again Strava shows different information for the MovingData. The following gpx files are referenced:
(1) https://gist.github.com/mbecker/a44881bfe29b0982fac6c69cae498125
Strava MovingDistance: 68,69km - MovingTime: 2:48:18
(2) https://gist.github.com/mbecker/85db2ad9660417e429ce29fa09983021
Strava MovingDistance: 7,68km - MovingTime: 36:42
For file (2), gpxgo calculates the following information:
Points: 1810
Length 2D (km): 7.672807397924687
Length 3D (km): 7.675121481862096
Length Vincenty (km): 7.689805197559434
--- Moving Data ---
Moving Data - Moving time (sec): 39m4s
Moving Data - Stopped time (sec): 1m22s
Moving Data - Moving Distance (km): 7.662678032398673
Moving Data - Stopped Distance (km): 0.012443449463421121
Moving Data - Max speed: 3.158368m/s = 11.370125km/h
So again roughly 3 min difference for a shorter (and faster) run. For file (1) the difference is even bigger: gpxgo calculates a moving time of 3h5m32s (see the full output below), but I had a longer pause during that trip, so from my real trip I can say that 3h5m32s is too long and the difference to Strava is too much.
My idea is to use the standard deviation to determine which Points should be used for the calculation of MovingData; for example, only the points within a 95% confidence interval should be used.
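To spell that out (my reading of the z-values used below): with μ as the mean point duration and σ as the standard deviation, a point counts as "moving" if its duration x lies in the range μ - z*σ <= x <= μ + z*σ, where z ≈ 1.644854 corresponds to a ~90% and z ≈ 1.959964 to a ~95% two-sided interval of a normal distribution.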
For that I've implemented the following algorithm:
```go
func (seg *GPXTrackSegment) MovingDataStandardDeviation(sigma float64) MovingData {
	var (
		movingTime      float64
		stoppedTime     float64
		movingDistance  float64
		stoppedDistance float64
	)

	speedsDistances := make([]SpeedsAndDistances, 0)

	// 1. Define the mean mu (μ) for the population: all summed values / count of values
	μ := seg.Duration() / float64(len(seg.Points))

	// 2.a) Define the deviation for each point: (x1 − μ)
	// 2.b) Square each deviation: (x1 − μ)^2
	// 2.c) Sum the squared deviations of all points
	var squaredDeviationSum float64

	// ToDo: point.Duration - can't it be calculated while parsing the xml?
	// The first point in seg.Points has no previous point, so it has no duration.
	allPoints := seg.Points[:0] // https://github.com/golang/go/wiki/SliceTricks#filtering-without-allocating
	for i := 1; i < len(seg.Points); i++ {
		previousPoint := seg.Points[i-1]
		point := seg.Points[i]

		// ToDo: How to manipulate data in a slice?
		// ptDiff := point.TimeDiff(&previousPoint)
		timedelta := point.Timestamp.Sub(previousPoint.Timestamp)
		point.Duration = timedelta.Seconds()
		allPoints = append(allPoints, point)

		squaredDeviationSum += math.Pow(point.Duration-μ, 2)
	}

	// 3. Define the variance of the population: divide the sum of the squared deviations by the
	//    size of the population (in the previous step we used all points except the first one: len(seg.Points)-1).
	variance := squaredDeviationSum / float64(len(seg.Points)-1)

	// 4. Define the standard deviation.
	standardDeviation := math.Sqrt(variance)

	// 5. Define the x1 and x2 values between which the points should lie (sigma σ defines the range).
	x1 := μ - sigma*standardDeviation
	x2 := μ + sigma*standardDeviation

	// Use only the points which are in the range x1 <= point.Duration <= x2.
	for i := 0; i < len(allPoints); i++ {
		// The first point in allPoints is the second one in seg.Points.
		var previousPoint GPXPoint
		if i == 0 {
			previousPoint = seg.Points[0]
		} else {
			previousPoint = allPoints[i-1]
		}
		point := allPoints[i]

		// timedelta := point.Timestamp.Sub(previousPoint.Timestamp)
		// point.Duration = timedelta.Seconds()
		if x1 <= point.Duration && point.Duration <= x2 {
			distance, err := point.DistanceVincenty(&previousPoint)
			if err != nil {
				fmt.Printf("Standard Deviation Error: %s\n", err)
				distance = point.Distance3D(&previousPoint)
			}
			// distance := point.Distance3D(&previousPoint)
			movingDistance += distance
			movingTime += point.Duration

			sd := SpeedsAndDistances{distance * 1000 / point.Duration, distance} // speed between the two points: distance*1000 / point.Duration (sec)
			speedsDistances = append(speedsDistances, sd)
		} else {
			stoppedTime += point.Duration
			distance, err := point.DistanceVincenty(&previousPoint)
			if err != nil {
				fmt.Printf("Standard Deviation Error: %s\n", err)
				distance = point.Distance3D(&previousPoint)
			}
			// distance := point.Distance3D(&previousPoint)
			stoppedDistance += distance
		}
	}

	var maxSpeed float64
	if len(speedsDistances) > 0 {
		maxSpeed = CalcMaxSpeed(speedsDistances)
		if math.IsNaN(maxSpeed) {
			maxSpeed = 0
		}
	}

	return MovingData{
		movingTime,
		stoppedTime,
		movingDistance,
		stoppedDistance,
		maxSpeed,
	}
}
```
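For reference, a minimal sketch of how the proposed method could be called (it assumes gpx.ParseFile and the Tracks/Segments fields from gpxgo; the MovingData field names are taken from the struct literal above, and "run.gpx" is just a placeholder path):

```go
package main

import (
	"fmt"
	"log"

	"github.com/tkrajina/gpxgo/gpx"
)

func main() {
	gpxFile, err := gpx.ParseFile("run.gpx") // placeholder file name
	if err != nil {
		log.Fatal(err)
	}
	for _, track := range gpxFile.Tracks {
		for _, segment := range track.Segments {
			md90 := segment.MovingDataStandardDeviation(1.644854) // ~90% interval
			md95 := segment.MovingDataStandardDeviation(1.959964) // ~95% interval
			fmt.Printf("~90%%: moving %.0fs, stopped %.0fs\n", md90.MovingTime, md90.StoppedTime)
			fmt.Printf("~95%%: moving %.0fs, stopped %.0fs\n", md95.MovingTime, md95.StoppedTime)
		}
	}
}
```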
The information for the gpx files is as follows (the plain gpx information repeated as a quick reference):
--- File 1 ---
(Strava MovingDistance: 68,69km - MovingTime: 2:48:18)
Points: 9733
Length 2D (km): 68.82811000360651
Length 3D (km): 68.84089953386604
Length Vincenty (km): 69.04083832569845
--- Moving Data ---
Moving Data - Moving time (sec): 3h5m32s
Moving Data - Stopped time (sec): 2m39s
Moving Data - Moving Distance (km): 68.82542650438606
Moving Data - Stopped Distance (km): 0.015473029479977595
Moving Data - Max speed: 5.500404m/s = 19.801453km/h
--- MovingDataStandardDeviation(1.644854 ~ 90%) ---
Standard Deviation - Moving time (sec): 2h47m9s
Standard Deviation - Stopped time (sec): 21m2s
Standard Deviation - Moving Distance (km): 68.62288196221773
Standard Deviation - Stopped Distance (km): 0.40151065842371714
Standard Deviation - Max speed: 6.633537m/s = 23.880733km/h
--- MovingDataStandardDeviation(1.959964 ~ 95%) ---
Standard Deviation - Moving time (sec): 2h47m9s
Standard Deviation - Stopped time (sec): 20m27s
Standard Deviation - Moving Distance (km): 68.61874042819656
Standard Deviation - Stopped Distance (km): 0.40151065842371714
Standard Deviation - Max speed: 5.517204m/s = 19.861934km/h
--- File 2 ---
(Strava MovingDistance: 7,68km - MovingTime: 36:42)
Points: 1810
Length 2D (km): 7.672807397924687
Length 3D (km): 7.675121481862096
Length Vincenty (km): 7.689805197559434
--- Moving Data ---
Moving Data - Moving time (sec): 39m4s
Moving Data - Stopped time (sec): 1m22s
Moving Data - Moving Distance (km): 7.662678032398673
Moving Data - Stopped Distance (km): 0.012443449463421121
Moving Data - Max speed: 3.158368m/s = 11.370125km/h
--- MovingDataStandardDeviation(1.644854 ~ 90%) ---
Standard Deviation - Moving time (sec): 36m41s
Standard Deviation - Stopped time (sec): 3m45s
Standard Deviation - Moving Distance (km): 7.5799742324965536
Standard Deviation - Stopped Distance (km): 0.09709473873435742
Standard Deviation - Max speed: 3.165000m/s = 11.393999km/h
--- MovingDataStandardDeviation(1.959964 ~ 95%) ---
Standard Deviation - Moving time (sec): 36m47s
Standard Deviation - Stopped time (sec): 3m33s
Standard Deviation - Moving Distance (km): 7.579203887420705
Standard Deviation - Stopped Distance (km): 0.09401947152890949
Standard Deviation - Max speed: 3.643202m/s = 13.115526km/h
So with the confidence interval of 1.644854σ (~90%) the MovingTime is quite close.
Maybe an idea for the library?
Now the funny part: the distance (gpx Length2D, Length3D, Vincenty) and the distance of the MovingData (which is calculated anew for the segments) are always different, both in the normal MovingData and in MovingDataStandardDeviation.
P.s. I've noticed that each func like gpx.Length3D or MovingData iterates over all points again. From my point of view all the per-point information (like distance between points, speed between points, etc.) could be computed once while parsing the gpx file, and the other information could then be accumulated for segment, track and gpx. That would mean the information is calculated while parsing the file and not every time a func is called. What was the idea behind the current design?
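A rough sketch of what I mean (pointMetrics and computeSegmentMetrics are made-up names for illustration, not existing gpxgo API; it only uses Distance3D and Timestamp, which the code above already relies on):

```go
// Hypothetical per-point cache, filled once per segment (e.g. right after parsing)
// and then reused by Length3D, MovingData, etc. instead of re-iterating over the points.
type pointMetrics struct {
	Distance float64 // distance to the previous point (same unit as Distance3D)
	Duration float64 // seconds since the previous point
	Speed    float64 // Distance / Duration
}

func computeSegmentMetrics(seg *GPXTrackSegment) []pointMetrics {
	metrics := make([]pointMetrics, len(seg.Points))
	for i := 1; i < len(seg.Points); i++ {
		prev := seg.Points[i-1]
		curr := seg.Points[i]
		m := pointMetrics{
			Distance: curr.Distance3D(&prev),
			Duration: curr.Timestamp.Sub(prev.Timestamp).Seconds(),
		}
		if m.Duration > 0 {
			m.Speed = m.Distance / m.Duration
		}
		metrics[i] = m
	}
	return metrics
}
```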
Thanks again for the library!