Probabilistic modelling for flow density relationship

Primary Supervisor
Oh, Yan-Nam
Other Supervisors
Qu, Xiaobo
Shahidi, Amir Etemad

The speed-density relationship is the foundation of traffic flow theory and transportation engineering. It represents the mathematical relationship among three key traffic parameters: flow, speed, and density. Since Greenshields first introduced the speed-density relationship in 1935, numerous models have been developed to represent it (e.g., the Greenberg, Underwood, Newell, Northwestern, and Wang et al. models). A speed-density function is expected to have both empirical accuracy and mathematical elegance. It was long believed that single-regime models could not represent all traffic states, from free-flow to jam conditions. In this thesis, field data were collected on Georgia State Route 400, I-80, and US 101 in the USA, and on the M1 Motorway in Australia. According to the literature, existing single-regime deterministic models calibrated by the least squares method (LSM) cannot fit empirical data consistently well across all traffic states, especially in congested conditions. However, we found that the inaccuracy of these deterministic models is caused not only by their functional forms but also by sample selection bias: most observations in the database refer to free-flow conditions, so the calibrated models are dominated by free-flow data and perform poorly for congested states. We therefore propose two methods to resolve the sample selection bias. First, the weighted least squares method (WLSM) was used, with three weighting schemes proposed to calibrate six single-regime deterministic models.
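The WLSM idea can be sketched on the classic Greenshields model, which is linear in its parameters and so admits a closed-form weighted fit. The code below is a minimal illustration, not the thesis's actual scheme: the synthetic data, the inverse-bin-frequency weighting, and all variable names are assumptions chosen to show how down-weighting over-represented free-flow points changes the calibration.

```python
import numpy as np

# Synthetic speed-density observations, deliberately skewed toward
# free flow (an illustrative stand-in for detector data such as GA400).
rng = np.random.default_rng(0)
k_free = rng.uniform(5, 40, 900)          # many free-flow points
k_cong = rng.uniform(40, 120, 100)        # few congested points
k = np.concatenate([k_free, k_cong])
v = 100.0 * (1.0 - k / 130.0) + rng.normal(0, 4, k.size)  # Greenshields + noise

def fit_greenshields(k, v, w=None):
    """Weighted least squares for v = vf * (1 - k/kj).

    The model is linear in (a, b) with v = a + b*k, where vf = a and
    kj = -a/b, so the (weighted) fit has a closed form via lstsq.
    """
    if w is None:
        w = np.ones_like(k)
    sw = np.sqrt(w)
    A = np.column_stack([sw, sw * k])
    coef, *_ = np.linalg.lstsq(A, sw * v, rcond=None)
    a, b = coef
    return a, -a / b                       # vf, kj

# One simple weighting choice (an assumption, not one of the thesis's
# three schemes): weight each point by the inverse frequency of its
# density bin, so congested bins count as much as free-flow bins.
edges = np.linspace(k.min(), k.max(), 20)
counts, _ = np.histogram(k, edges)
counts = np.maximum(counts, 1)             # guard against empty bins
idx = np.clip(np.digitize(k, edges) - 1, 0, counts.size - 1)
w = 1.0 / counts[idx]

vf_lsm, kj_lsm = fit_greenshields(k, v)      # ordinary LSM
vf_wls, kj_wls = fit_greenshields(k, v, w)   # WLSM
```

Because the synthetic data follow the Greenshields form exactly, both fits recover the true parameters here; the WLSM's advantage appears when the functional form is imperfect and the unweighted fit is pulled toward the over-sampled free-flow regime.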
According to our calibration results, these models, calibrated with the WLSM, fit the dataset reasonably well and represented all traffic states from free-flow to jam conditions. The model validation reports relative errors, mean squared error (MSE), and root-mean-square error (RMSE). In addition, a theoretical investigation revealed the deficiency of the LSM: the inaccuracy of single-regime speed-density models is caused not only by their functional forms but also by sample selection bias. The second method takes a fundamentally different approach yet yields results very similar to, and consistent with, those of the WLSM. The proposed approach applies reproducible sample generation to convert the observational data into experimental data, after which the traditional LSM can be applied to calibrate accurate traffic flow fundamental diagrams. Two reproducible sample generation approaches were proposed in this research; our analysis shows that the first is somewhat affected by outliers, while the second is more robust to them. Our data also show that speed possesses a high degree of randomness for a given traffic state, and is therefore more appropriately represented by random variables than by deterministic numbers. The thesis then proposes a probabilistic speed-density relationship to represent the variance of speed at a given density, applying a new calibration approach to generate stochastic traffic flow fundamental diagrams. We first proved that percentile-based fundamental diagrams are obtainable from the proposed model. We further proved that the proposed model has continuity, differentiability, and convexity properties, so that it can be solved efficiently by the Gauss-Newton method.
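The idea of converting observational data into experimental data can be sketched as a seeded, density-balanced resampling step: the fixed seed makes the generated sample reproducible, and the balanced bins remove the dominance of free-flow observations before ordinary LSM is applied. This is a simplified sketch, not the thesis's two actual generation procedures; the bin counts, sample sizes, and function name are assumptions.

```python
import numpy as np

def balance_by_density(k, v, n_bins=20, n_per_bin=30, seed=0):
    """Resample so every density bin contributes equally.

    A seeded RNG keeps the generated sample reproducible; sampling a
    fixed number of points per bin yields a density-balanced dataset
    that ordinary LSM can then be applied to.
    """
    rng = np.random.default_rng(seed)
    edges = np.linspace(k.min(), k.max(), n_bins + 1)
    ks, vs = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (k >= lo) & (k <= hi)
        if not mask.any():
            continue                       # skip empty density bins
        pick = rng.choice(np.flatnonzero(mask), n_per_bin, replace=True)
        ks.append(k[pick])
        vs.append(v[pick])
    return np.concatenate(ks), np.concatenate(vs)
```

A variant that takes the median speed per bin instead of raw resampled points would be less sensitive to outliers, mirroring the abstract's observation that the second, more robust approach handles outliers better.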
By selecting percentile values from 0 to 1, the speed distribution at any given density can be derived, and the calibrated speed distributions fit the GA400 dataset closely. The proposed methodology has wide applications. First, new approaches can be devised to evaluate calibrated fundamental diagrams that account not only for the residuals but also for the ability to reflect the stochasticity of the samples. Second, stochastic fundamental diagrams can be used to develop and evaluate traffic control strategies; in particular, the proposed stochastic fundamental diagram can be applied to model and optimize connected and automated vehicles at the macroscopic level, with the objective of reducing the stochasticity of traffic flow. Last but not least, the methodology can be applied to generate stochastic models for most regression problems with scattered samples.
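A percentile-based fundamental diagram can be illustrated with the pinball (check) loss, whose minimizer is the tau-th conditional percentile. The thesis solves its convex percentile model with Gauss-Newton; the sketch below uses a generic Nelder-Mead minimizer on a Greenshields form as a simpler stand-in, and the synthetic data and function names are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import minimize

def pinball(res, tau):
    """Pinball (check) loss; its minimizer is the tau-th percentile."""
    return np.mean(np.maximum(tau * res, (tau - 1.0) * res))

def fit_percentile_greenshields(k, v, tau):
    """Fit v = vf*(1 - k/kj) at percentile tau.

    Linear in (a, b) via v = a + b*k, with vf = a and kj = -a/b.
    """
    def loss(p):
        a, b = p
        return pinball(v - (a + b * k), tau)
    res = minimize(loss, x0=np.array([v.max(), -0.5]), method="Nelder-Mead")
    a, b = res.x
    return a, -a / b

# Synthetic scatter: a Greenshields mean plus noise (an illustrative
# stand-in for the GA400 observations).
rng = np.random.default_rng(0)
k = rng.uniform(5, 120, 2000)
v = 100.0 * (1.0 - k / 130.0) + rng.normal(0, 5, k.size)

# One calibrated curve per percentile: together these trace out the
# speed distribution at any given density.
curves = {tau: fit_percentile_greenshields(k, v, tau) for tau in (0.1, 0.5, 0.9)}
```

Sweeping tau over (0, 1) and reading off the fitted curves at a fixed density recovers the full speed distribution at that density, which is the sense in which the percentile-based diagrams yield speed distributions.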

Thesis Type
Thesis (PhD Doctorate)
Degree Program
Doctor of Philosophy (PhD)
School of Eng & Built Env
Rights Statement
The author owns the copyright in this thesis, unless stated otherwise.
Flow density
Stochastic traffic flow
Traffic flow