Entropy
o Measures signal complexity
o EEG with low entropy is due to a small number of dominating processes
o EEG with high entropy is due to a large number of processes
o Relatively simple measure of complexity and system regularity
o Quantifies the predictability of subsequent amplitude values of the EEG based on the knowledge of previous amplitude values
o As a relative measure, it depends on three parameters (see the sketch after this list)
The length of the epoch (N)
The length of the compared runs (m)
The filtering level (r)
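A minimal sketch of the standard ApEn(m, r, N) computation, assuming NumPy is available; the parameters m, r, and N correspond to the run length, filtering level, and epoch length listed above.

import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r, N) of a 1-D signal x.

    m        -- length of the compared runs (template length)
    r_factor -- filtering level, as a fraction of the signal's SD
    N        -- the epoch length, taken here as len(x)
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    r = r_factor * np.std(x)  # tolerance for counting template matches

    def phi(m):
        # All overlapping templates (runs) of length m
        templates = np.array([x[i:i + m] for i in range(n - m + 1)])
        # Chebyshev distance between every pair of templates
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        # Fraction of templates within tolerance r (self-matches included)
        c = np.mean(dist <= r, axis=1)
        return np.mean(np.log(c))

    # Low ApEn: runs of length m that match tend to still match at length
    # m + 1, i.e. the next amplitude value is predictable from previous ones.
    return phi(m) - phi(m + 1)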
o Approximate entropy and Shannon entropy are two entirely different measures
o Approximate entropy measures the predictability of future amplitude values of the EEG based on one or two previous amplitude values
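One way to see the difference, reusing the approximate_entropy sketch above: Shannon entropy of the amplitude histogram depends only on the distribution of values, never their temporal order, so sorting a signal leaves it unchanged while ApEn collapses.

import numpy as np

def shannon_entropy(x, bins=32):
    """Shannon entropy (bits) of the amplitude histogram of x.

    Looks only at the distribution of values, not their order in time.
    """
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
x_sorted = np.sort(x)  # same amplitudes, perfectly predictable order

print(shannon_entropy(x), shannon_entropy(x_sorted))          # identical
print(approximate_entropy(x), approximate_entropy(x_sorted))  # ApEn drops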
o Increasing anesthetic concentrations are associated with increasing EEG pattern regularity
o EEG approximate entropy decreases with increasing anesthetic concentration
o At high doses of anesthetics, periods of EEG silence with intermittent bursts of high frequencies occur (burst suppression)
o For example, the median EEG frequency method fails to characterize anesthetic concentrations because of these bursts (see the toy illustration below)
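A toy illustration of the regularity claims above, using synthetic stand-ins for real EEG and the approximate_entropy sketch from earlier: a burst-suppression-like trace (near-silence with intermittent high-frequency bursts) comes out with a much lower ApEn than an irregular awake-like trace, even though the bursts themselves are fast.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 4, 1024)  # 4 s at 256 Hz (illustrative values)

awake_like = rng.standard_normal(1024)         # irregular: many processes

suppressed = 0.01 * rng.standard_normal(1024)  # near-isoelectric baseline
for start in (100, 500, 900):                  # intermittent 30 Hz bursts
    suppressed[start:start + 60] += np.sin(2 * np.pi * 30 * t[start:start + 60])

print(approximate_entropy(awake_like))  # higher ApEn: unpredictable
print(approximate_entropy(suppressed))  # lower ApEn: regular, mostly silent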
o The EEG's approximate entropy is also a good candidate for characterizing different extents of cerebral ischemic injury:
First, in the early stage of ischemia, the approximate entropy difference between the ischemic region and the normal region increases.
Second, 18 minutes after the onset of ischemia, the approximate entropy of the ischemic region becomes lower than its pre-ischemia (normal) value, which may indicate that an emergent injury is being induced.
Last, the approximate entropy of the ischemic region (left brain) is lower than that of the normal region (right brain), as in the hypothetical comparison below.
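A hypothetical two-channel comparison along these lines, again reusing the approximate_entropy sketch; the synthetic left (ischemic, modeled here as more regular) and right (normal) traces are illustrative stand-ins, not real recordings.

import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 2, 512)

right_eeg = rng.standard_normal(512)  # normal region: irregular
left_eeg = np.sin(2 * np.pi * 8 * t) + 0.2 * rng.standard_normal(512)  # ischemic: more regular

apen_right = approximate_entropy(right_eeg)
apen_left = approximate_entropy(left_eeg)

# The between-region ApEn difference is the candidate injury marker.
print(f"normal (right):  {apen_right:.3f}")
print(f"ischemic (left): {apen_left:.3f}")
print(f"difference:      {apen_right - apen_left:.3f}")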