Protein arginine methylation facilitates KCNQ channel-PIP2 interaction leading to seizure suppression
Abstract
KCNQ channels are critical determinants of neuronal excitability and have thus emerged as novel targets of anti-epileptic drugs. To date, KCNQ channel modulation has been characterized as mostly inhibitory, acting via Gq-coupled receptors, Ca2+/CaM, and protein kinase C. Here we demonstrate that methylation of KCNQ by protein arginine methyltransferase 1 (Prmt1) positively regulates KCNQ channel activity, thereby preventing neuronal hyperexcitability. Prmt1+/- mice exhibit epileptic seizures. Methylation of KCNQ2 channels at four arginine residues by Prmt1 enhances PIP2 binding, whereas Prmt1 depletion lowers the PIP2 affinity of KCNQ2 channels and thereby reduces channel activity. Consistently, adding exogenous PIP2 to Prmt1+/- neurons restores KCNQ currents and neuronal excitability to wild-type levels. Collectively, we propose that Prmt1-dependent facilitation of the KCNQ-PIP2 interaction underlies the positive regulation of KCNQ activity by arginine methylation, which may serve as a key target for preventing neuronal hyperexcitability and seizures.
Article and author information
Author details
Funding
National Research Foundation of Korea (NRF-2012R1A2A2A01046878)
- Hyun-Ji Kim
- Seul-Yi Lee
- Hanna Kim
- Jewoo Koh
- Hana Cho
National Research Foundation of Korea (NRF-2015R1A2A1A15051998)
- Myong-Ho Jeong
- Tuan Anh Vuong
- Jong-Sun Kang
National Research Foundation of Korea (2015-048055)
- Kyung-Ran Kim
- Won-Kyung Ho
The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.
Reviewing Editor
- Indira M Raman, Northwestern University, United States
Ethics
Animal experimentation: All animal experiments were approved by the Institutional Animal Care and Research Advisory Committee at Sungkyunkwan University School of Medicine Laboratory Animal Research Center (Approval No. IACUC-11-39).
Version history
- Received: April 22, 2016
- Accepted: July 27, 2016
- Accepted Manuscript published: July 28, 2016 (version 1)
- Version of Record published: August 24, 2016 (version 2)
Copyright
© 2016, Kim et al.
This article is distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use and redistribution provided that the original author and source are credited.
Metrics
- 2,656 views
- 696 downloads
- 34 citations
Views, downloads and citations are aggregated across all versions of this paper published by eLife.