Dynamic Pricing for Wireless Charging Lane Management Based on Deep Reinforcement Learning

Research output: Journal Publication › Article › peer-review

Abstract

We consider a dynamic pricing problem in a double-lane system consisting of one general-purpose lane and one wireless charging lane (WCL). The electricity price is dynamically adjusted to influence the lane-choice behaviors of incoming electric vehicles (EVs), thereby regulating the traffic assignment between the two lanes, with both traffic operation efficiency and charging service efficiency considered in the control objective. We first establish an agent-based dynamic double-lane traffic system model in which each EV acts as an agent with distinct behavioral and operational characteristics. We then propose a deep Q-learning algorithm to derive the optimal pricing decisions, and design a classification and regression tree (CART) algorithm for benchmarking. The simulation results reveal that the deep Q-learning algorithm outperforms CART in optimizing dynamic pricing strategies by more effectively leveraging system dynamics and future traffic demand information, and that both outperform a static pricing strategy. This study serves as a pioneering work exploring dynamic pricing for WCLs.
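The pricing-control idea in the abstract can be illustrated with a toy sketch: a controller observes WCL occupancy and sets a discrete electricity price, where higher prices steer incoming EVs toward the general-purpose lane. For brevity this uses tabular Q-learning rather than a deep Q-network, and all dynamics, state/action discretizations, and reward weights below are invented for illustration, not taken from the paper.

```python
import random

random.seed(0)

# Toy double-lane environment (all dynamics are hypothetical):
# state  = WCL occupancy level, 0 (empty) .. 4 (congested)
# action = electricity price level, 0 (low) .. 2 (high)
N_STATES, N_ACTIONS = 5, 3

def step(state, action):
    # Higher prices push incoming EVs to the general-purpose lane,
    # lowering WCL occupancy; lower prices attract EVs to the WCL.
    drift = 1 - action  # price 0 -> +1 occupancy, price 2 -> -1
    noise = random.choice([-1, 0, 0])
    next_state = min(N_STATES - 1, max(0, state + drift + noise))
    # Reward trades off charging service (utilized occupancy)
    # against a congestion penalty when the WCL is saturated.
    congestion_penalty = 2.0 if next_state == N_STATES - 1 else 0.0
    reward = 0.5 * next_state - congestion_penalty
    return next_state, reward

# Tabular Q-learning with an epsilon-greedy behavior policy.
Q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]
alpha, gamma, eps = 0.1, 0.95, 0.1
state = 0
for _ in range(20000):
    if random.random() < eps:
        a = random.randrange(N_ACTIONS)
    else:
        a = max(range(N_ACTIONS), key=lambda x: Q[state][x])
    s2, r = step(state, a)
    Q[state][a] += alpha * (r + gamma * max(Q[s2]) - Q[state][a])
    state = s2

# Greedy pricing policy: one price level per occupancy state.
policy = [max(range(N_ACTIONS), key=lambda a: Q[s][a]) for s in range(N_STATES)]
print(policy)
```

The learned policy typically charges a low price when the WCL is underused (to attract EVs and raise charging service efficiency) and a high price near saturation (to protect traffic operation efficiency); the paper's deep Q-network plays the role of the Q-table here, generalizing over a richer traffic state.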
Original language: English
Article number: 9831
Journal: Sustainability
Volume: 17
Issue number: 21
DOIs
Publication status: Published - 4 Nov 2025

Free Keywords

  • dynamic pricing
  • wireless charging lane
  • electric vehicle
  • deep reinforcement learning
  • agent-based modeling
