This is a quick two page guide I made back when I was at Penn State tutoring first and second year physics.
Externalities, Subsidies, and Taxpayer Cost of Fossil Fuels
There are many challenges to overcoming our dependency on fossil fuels. In his book How the World Really Works, Vaclav Smil details how everything from the oil-derived plastics used in lifesaving tools in the operating room to the coke necessary for making steel has no equivalent replacement. On the other hand, fossil fuels used in energy production and transportation do have alternatives, ones that could yield significant decreases in greenhouse gas emissions sooner rather than later if readily adopted. Meanwhile, despite the many alternatives, we as taxpayers are fronting the costs to prop up the fossil fuel industries, paying for the damages and recovery of warming-related weather events, and paying the rising health care costs from carbon extraction and processing operations, all while those industries rake in record profits.
In the latter half of his book, The Nutmeg's Curse, Amitav Ghosh dives into the economics of fossil fuels, specifically oil, framing it as today's nutmeg, the commodity that drives and props up the Anglosphere. He references the term homo economicus as a descriptor of modern-day citizens obsessed with the notion that capitalism is the prime mover of the modern world, with geopolitics, empires, and power coming into play only after (Ghosh, 2021, p. 115). In the often-heated debates about climate change, sustainable economics, and capitalism-friendly policy, the narrative against rapid adoption of renewables claims high costs and economic burdens for taxpayers. Anti-renewable politicians, think tanks, and other NGOs continually spin a narrative of the drawbacks, economic woes, and horrors of green energy. As then-Speaker McCarthy put it in April 2023 during the debt ceiling debates, we should "end the green giveaways that distort the market and waste taxpayers' money" (Budryk & Frazin, 2023).
In keeping with the idea of homo economicus, we should explore the true cost of fossil fuels. In the free-market ideal, signals from producers and consumers play back and forth to create a balance between supply and demand, an equilibrium of efficiency along a growth curve. When this balance isn't achieved by the "invisible hand" of the market, the market fails and requires intervention. The government has two tools for market correction: taxes and subsidies. Taxes are used to reduce the negative impacts of externalities, while subsidies are used to stabilize markets by shifting supply or demand curves toward the equilibrium price point. In essence, a subsidy offsets expensive purchase prices or production costs and is usually provided by the government to stabilize the economy. We'll come back to these definitions shortly, but first we'll look at negative externalities in more detail.
Climate change is an externality, a public-goods problem in which the costs spill outside the market and are not captured in market prices. The most significant of all environmental externalities is global warming (Nordhaus, 2019), and the burning of fossil fuels is the largest contributor to the increase in greenhouse gases (GHGs) that drive it. When we say the costs are not captured in market prices, we mean that the money made by producers of the goods or services causing the externality, in this case CO2 from burning fossil fuels, is not used to redress the negative effects: more intense hurricanes, hotter summers with longer, deadlier heatwaves, more wildfires, flooding, and the list goes on.
Each hurricane that makes landfall, atmospheric river that fuels extreme flash flooding, hectare charred by wildfire, crop lost to drought, and fishery closed by heat-induced die-offs comes with a large price tag. Between 1989 and 2020 there were 71 federally approved fishery disasters, events where a fishery has been deemed overfished and shut down. To offset an estimated $3.2 billion (USD) in direct revenue losses, over $2 billion in Congressional allocations have been approved (Bellquist et al., 2021). That's $2 billion in taxpayer dollars paying for the losses and recovery from extreme environmental events (marine heatwaves, hurricanes, harmful algal blooms). In a report by the Natural Resources Defense Council, federal expenses on extreme weather events in 2012 exceeded $100 billion (Lashof & Stevenson, 2013, p. 4), with $96 billion in direct taxpayer dollars. Fast forward ten years and the situation has only become worse.
As taxpayers continue to pay the costs of climate-related events, while also paying directly for the fossil fuels that heat their homes, generate their electricity, and power their cars, they also pay to subsidize the fossil fuel companies themselves: global companies that brought in a combined $200 billion in profits for fiscal year 2022 (Grantham-Philips, n.d.). A true homo economicus, thinking rationally about what's been discussed so far, would hopefully be raising an eyebrow at the thought of subsidizing fossil fuel producers when they are already paying not only for the product but then again to cover the negative externalities.
Despite the successful bottom line on the balance sheet, fossil fuel producers benefit from a variety of government subsidies, such as (Fact Sheet | Fossil Fuel Subsidies: A Closer Look at Tax Breaks and Societal Costs (2019) | White Papers | EESI, n.d.):
- Tax credits for oil producers based on the locations they extract from or export to.
- Write-offs for oil speculation and exploration costs.
- Consumption subsidies to reduce end-user costs.
As it becomes more evident that a shift away from fossil fuels to renewable and alternative energy is the only way to secure a safe and viable future for generations to come, homo economicus would expect energy subsidies to be used to further green and renewable sources. Instead, between 2010 and 2017 the U.S. Department of Energy (DOE) provided $2.66 billion (USD) for fossil energy research and development projects. Of that total, 91% was research and development money ($1.4 billion) used for coal-related research, including gasification (turning coal into gas) (Fact Sheet | Fossil Fuel Subsidies: A Closer Look at Tax Breaks and Societal Costs (2019) | White Papers | EESI, n.d.).
Artificially keeping the cost of fossil fuels low through subsidies (though, with profits like those above, are prices as low as they could be?) erodes the incentive to invest in renewable and green energy, impeding innovation (Timperley, 2021). Eliminating fossil fuel subsidies would return an estimated $121 billion annually in increased federal revenue, which could in turn be used to spur innovation in climate-focused energy solutions (Fact Sheet | Proposals to Reduce Fossil Fuel Subsidies (2021) | White Papers | EESI, n.d.).
In a free-market system, rising fossil fuel prices should decrease demand and spur innovation to increase the reliability, efficiency, and affordability of green and renewable sources. Instead, current policy, both domestic and global, ensures the success of fossil fuels as an energy mainstay, with no regard for the true total cost of these policies. The first step toward sparking this correction is undoing the economic policy designed to benefit the producers. That is easier said than done, as we continually see pushback from House Republicans on any attempt to promote clean energy initiatives and reduce fossil fuel production (House Republicans Pass Energy Bill to Roll Back Regulation of Fossil Fuel Production, 2023).
U.S. direct subsidies amount to approximately $20.5 billion annually, $14.7 billion federal and $5.8 billion state, a hefty sum of tax dollars. Combine that with the annual cost of the negative externalities from fossil fuels, and the estimated U.S. subsidy rises to $649 billion per year (Fact Sheet | Proposals to Reduce Fossil Fuel Subsidies (2021) | White Papers | EESI, n.d.). At such a high cost, homo economicus should by now be convinced that it's time for a policy change that makes better use of their tax dollars.
The 117th Congress introduced a handful of bills to reform fossil fuel subsidies, including the Ending Taxpayer Welfare for Oil and Gas Companies Act of 2021 (H.R. 1517) and the End Oil and Gas Tax Subsidies Act of 2021 (H.R. 2184). These bills were introduced in 2021 and have made little progress since. President Biden and former President Obama both made reforming and eliminating fossil fuel subsidies part of their platforms, but neither has had success ("Biden Budget to Target U.S. Fossil Fuel Subsidies," 2023).

We are living on borrowed time economically, ecologically, and sociologically by continuing to use subsidies to prop up the fossil fuel sector. The continuation of fossil fuel subsidies, through the influence of big oil and acquiescent policymakers, is a devastating setback to progress toward large-scale, robust renewable energy solutions. I hope homo economicus would agree that our taxes should be put toward securing our welfare not just for today, but for our future. We need to secure the welfare of everyone, not just the CEOs of big oil and their lapdog politicians. I only hope more folks will truly adopt the homo economicus persona and take a rational, academic approach to understanding these issues, in turn making well-informed decisions at the polls to support those who support us.
References
Bellquist, L., Saccomanno, V., Semmens, B. X., Gleason, M., & Wilson, J. (2021). The rise in climate change-induced federal fishery disasters in the United States. PeerJ, 9, e11186. https://doi.org/10.7717/peerj.11186
Biden budget to target U.S. fossil fuel subsidies. (2023, March 9). Reuters. https://www.reuters.com/business/energy/biden-budget-target-us-fossil-fuel-subsidies-2023-03-09/
Fact Sheet | Fossil Fuel Subsidies: A Closer Look at Tax Breaks and Societal Costs (2019) | White Papers | EESI. (n.d.). Retrieved October 12, 2023, from https://www.eesi.org/papers/view/fact-sheet-fossil-fuel-subsidies-a-closer-look-at-tax-breaks-and-societal-costs
Fact Sheet | Proposals to Reduce Fossil Fuel Subsidies (2021) | White Papers | EESI. (n.d.). Retrieved October 12, 2023, from https://www.eesi.org/papers/view/fact-sheet-proposals-to-reduce-fossil-fuel-subsidies-2021
Ghosh, A. (2021). The nutmeg’s curse: Parables for a planet in crisis. University of Chicago Press.
Grantham-Philips, W. (n.d.). “Outrageous”: Big oil made almost $200 billion in 2022 as world faced energy crisis. Here’s the breakdown. USA TODAY. Retrieved October 26, 2023, from https://www.usatoday.com/story/money/at-home/2023/02/10/oil-companies-2022-profits-exxon-bp-shell/11170023002/
House Republicans pass energy bill to roll back regulation of fossil fuel production. (2023, March 30). PBS NewsHour. https://www.pbs.org/newshour/politics/house-republicans-pass-energy-bill-to-roll-back-regulation-of-fossil-fuel-production
Lashof, D., & Stevenson, A. (2013). Who Pays for Climate Change? (IP: 13-05A; p. 13). Natural Resources Defense Council (NRDC). https://www.nrdc.org/sites/default/files/taxpayer-climate-costs-IP.pdf
Nordhaus, W. (2019). Climate Change: The Ultimate Challenge for Economics. American Economic Review, 109(6), 1991–2014. https://doi.org/10.1257/aer.109.6.1991
Timperley, J. (2021). Why fossil fuel subsidies are so hard to kill. Nature, 598(7881), 403–405. https://doi.org/10.1038/d41586-021-02847-2
Budryk, Z., & Frazin, R. (2023, April 19). Republicans seek to repeal renewable tax credits, pass energy package in debt limit proposal. The Hill. https://thehill.com/policy/energy-environment/3959517-republicans-seek-to-repeal-renewable-tax-credits-pass-energy-package-in-debt-limit-proposal/
A Comparison of Machine Learning Regression Models for Feature Importance Determination of Climate Forcings in East Antarctic Sea Ice Summer Melt
Ryan Eagan
Submission Date: December 15th, 2023
A Report submitted for MAST638: Machine Learning for Marine Science, Fall 2023
Abstract
In the Adélie Land region of East Antarctica, atmospheric and oceanographic forcing influences sea ice concentration, thickness, and summer sea ice melt, particularly in the local Dumont d'Urville (DDU) Sea. Previous work (Smith et al., 2011) demonstrated that sea ice coverage in the DDU Sea is out of phase with the overall patterns of sea ice concentration along the East Antarctic coast near Adélie Land. However, the causes of this variation have not yet been fully explored. Using ERA5 climate reanalysis data (2003-2022, gridded at 0.25° x 0.25°, hourly) from the European Centre for Medium-Range Weather Forecasts (ECMWF), we construct a summertime (NDJF) climatology of the surface, atmospheric, and oceanographic conditions at DDU and in the surrounding area (a box bounded by 120°E, 54°S and 160°E, 70°S). To gain insight into the local-to-regional processes that drive sea ice concentration and thickness, we compare eight supervised machine learning regression models to determine feature importance with respect to sea ice ablation. We find agreement in feature importance across five of the models. The best performing model, Random Forest regression, demonstrates a strong bias towards the sea surface temperature feature and minimizes all other features. Comparison of feature importance and model performance suggests that further model tuning can provide conclusive results with respect to the importance and relationship of atmospheric forcings with sea ice concentration.
1 Introduction and Background
Satellite observations over the past 40 years have shown a marginal increasing trend in total Antarctic sea ice (Eayrs et al., 2019), though there is strong temporal and regional variability (Kusahara et al., 2019). The drivers of this variability are still not fully understood, but local atmospheric forcing is considered a significant factor, along with regional-scale atmospheric and oceanic processes (Eayrs et al., 2019). Although overall sea ice extent has been relatively consistent, future projections and trend detection rely on a comprehensive understanding of the processes that drive interannual and seasonal sea ice cycles.
The sea ice off the East Antarctic coast, particularly in and around the Dumont d'Urville Sea off the Terre Adélie coastline (between 139°E and 146°E and between 60°S and 70°S), exemplifies this observed variability. The Dumont d'Urville region, which houses a permanent French research station on Petrel Island (66°39'48"S, 140°0'4"E), provides a unique opportunity to study the sea ice regime with a focus on the interplay of atmospheric and oceanic forcings with the sea ice. Smith et al. (2011) analyzed sea ice concentration from 2003-2009 in the Dumont d'Urville Sea using AMSR-E remotely sensed data and demonstrated that its high seasonal and interannual variability is out of phase with the larger surrounding area.
To better understand the driving mechanisms of sea ice melt in this area, we compare supervised machine learning multivariate regression algorithms for identifying potential relationships between atmospheric variables and sea ice concentration through feature importance. Weekly mean sea ice concentration and atmospheric reanalysis data are regressed under the assumption that the relationships between sea ice concentration and the atmospheric variables are linear and independent.
2 Data and Methods
2.1 Study Period
Our analysis considers only the austral summer season which starts on November 15th and ends on February 15th. During this period, surface melt is the main ablation process.
2.2 Data
The regression analysis is built on a combination of remotely sensed sea ice data and ERA5 climate reanalysis datasets. ERA5 reanalysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF) were obtained from the Copernicus Climate Data Store in NetCDF format, gridded at 0.25° x 0.25° (Hersbach et al., 2020). We obtained hourly values at 0:00, 6:00, 12:00, and 18:00 on single levels from November 15th, 2003 to February 15th, 2022 for the following variables: 2-meter surface temperature (t2m), sea surface temperature (sst), wind direction, wind speed magnitude, low cloud cover (lcc), and downwelling longwave radiation (ssrd). Relative humidity (r) was obtained from the corresponding pressure-levels dataset at the 1000 millibar level. These variables are used as the predictor inputs for our regression models.
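As a rough illustration of how these inputs might be assembled, the minimal sketch below opens the ERA5 NetCDF files with xarray and subsets them to the study box given in the abstract. The file names are hypothetical, and the ERA5 short names for the 10 m wind components (u10, v10) and the pressure-level coordinate handling are assumptions; an actual download from the Climate Data Store may be split across multiple files.

```python
import xarray as xr

# Hypothetical file names for the single-level and 1000 hPa pressure-level downloads.
era5 = xr.open_dataset("era5_single_levels_2003_2022.nc")
rh = xr.open_dataset("era5_rh_1000hPa_2003_2022.nc")

# Subset to the study box from the abstract (120-160E, 54-70S).
# ERA5 latitudes are typically ordered north to south, hence slice(-54, -70).
era5 = era5.sel(longitude=slice(120, 160), latitude=slice(-54, -70))

# Attach relative humidity at 1000 hPa; squeeze() drops a singleton level dimension if present.
era5["r"] = rh["r"].squeeze().sel(longitude=slice(120, 160), latitude=slice(-54, -70))
```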
Remotely sensed sea ice data, data set G10033, were retrieved from the National Ice Center for the full study area on a weekly basis from 2003 to 2022 (Fetterer & Stewart, 2020). The raster sea ice concentration is derived from U.S. National Ice Center (USNIC) sea ice charts based largely on satellite observations. The SIGRID-3 dataset has a resolution of 10 km x 10 km on the Equal-Area Scalable Earth (EASE) grid projection. The data provide sea ice concentration per grid cell as a percentage of sea ice content. We use the mid-range total concentration as the predicted value for our regression models.
2.3 Methods
2.3.1 Data Preprocessing and Feature Engineering
Sea ice data are reprojected to the EPSG:4326 (WGS 84) coordinate system to match the ERA5 grid and ensure the alignment of coordinates. The grid cells are then weighted as a function of latitude to correct for cell area. Additionally, we remove data for February 29th in leap years so the data can be aligned by week number. Wind magnitude and direction are computed from the U and V wind components provided by ERA5.
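A minimal sketch of the derived wind features and the latitude weights follows, continuing from the loading sketch above; the cosine-of-latitude weighting is one common way to implement the area correction described in the text, and the u10/v10 names remain assumptions.

```python
import numpy as np

# Wind speed and meteorological direction (degrees clockwise from north, the
# direction the wind blows from), derived from the 10 m U/V components.
era5["wind_speed"] = np.sqrt(era5["u10"] ** 2 + era5["v10"] ** 2)
era5["wind_dir"] = (180.0 + np.degrees(np.arctan2(era5["u10"], era5["v10"]))) % 360.0

# Cosine-of-latitude weights correct for the shrinking area of 0.25 degree
# grid cells toward the pole when averaging over the study box.
lat_weights = np.cos(np.deg2rad(era5["latitude"]))
```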
Dimensional reduction of the spatial data is accomplished by first taking daily means over the entire spatial area; that is, we average all grid cells and their respective hourly entries to get one mean value for the study area per day. These daily means are then averaged by week and assigned the corresponding ISO week number (1, 2, 3, ..., 52). To capture the austral summer, we start at week 45, carry through the new year, and stop at week 8. This gives us data from the beginning of November through the end of February, which includes the season shoulders.
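The aggregation could be sketched as below, assuming the ERA5 time coordinate is named "time" and continuing from the previous sketches; the weighted spatial mean, daily resampling, and ISO-week grouping follow the steps described in the text.

```python
# Predictor variables named in section 2.2 plus the derived wind features.
predictor_vars = ["t2m", "sst", "wind_speed", "wind_dir", "lcc", "ssrd", "r"]

# Latitude-weighted mean over the study box, then daily means over the hourly samples.
area_mean = era5[predictor_vars].weighted(lat_weights).mean(dim=["latitude", "longitude"])
daily = area_mean.resample(time="1D").mean()

# Average by ISO week number and keep only the austral-summer weeks (45-52 and 1-8).
df = daily.to_dataframe()
df["year"] = df.index.year
df["iso_week"] = df.index.isocalendar().week
weekly = df.groupby(["year", "iso_week"]).mean()
summer_weeks = set(range(45, 53)) | set(range(1, 9))
weekly = weekly[weekly.index.get_level_values("iso_week").isin(summer_weeks)]
```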
Outliers beyond the 5th and 95th percentiles are removed, along with any weeks that are missing data, before the data are split into 70% (N=123) training data and 30% (N=54) test data for the machine learning algorithms. The data are scaled using Scikit-learn's StandardScaler, which transforms each value to a z-score, z = (x − μ)/σ, where μ is the mean and σ is the standard deviation (Pedregosa et al., 2011). We confirm a close-to-normal distribution for each variable from a generated pair plot (Appendix, Figure A1) to justify the choice of z-score standardization. Pearson correlation coefficients were calculated between every pair of variables, revealing some multicollinearity between surface temperature, relative humidity, and sea surface temperature (Appendix, Figure A2).
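A rough sketch of the train/test preparation follows, assuming the weekly predictor means have been joined with the weekly mid-range sea ice concentration (column name tc_mid, taken from Figure A1) into a single DataFrame `data`; the row-wise percentile clipping is one interpretation of the outlier rule stated above, and the random seed is arbitrary.

```python
from sklearn.model_selection import train_test_split

predictor_vars = ["t2m", "sst", "wind_speed", "wind_dir", "lcc", "ssrd", "r"]
target = "tc_mid"  # mid-range total sea ice concentration

# Drop weeks with missing data, then drop rows falling outside the 5th-95th
# percentile band of any column.
data = data.dropna()
low, high = data.quantile(0.05), data.quantile(0.95)
data = data[((data >= low) & (data <= high)).all(axis=1)]

# 70/30 split; z-score scaling is applied inside each model's pipeline (section 2.3.3).
X_train, X_test, y_train, y_test = train_test_split(
    data[predictor_vars], data[target], test_size=0.30, random_state=0
)

# Pearson correlation matrix used to check for multicollinearity (Figure A2).
corr = data.corr(method="pearson")
print(corr.round(2))
```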
2.3.2 Regression Algorithms
We compare eight different regression algorithms from the Scikit-learn Machine Learning in Python package: Ordinary Least Squares (OLS), Elastic-Net (Elastic and ElasticCV), Random Forest (RF), Ridge regression (RidgeCV), Lasso regression (LassoCV), Bayesian regression (Bayes), and Stochastic Gradient Descent regression (SGD) (Pedregosa et al., 2011). Each regressor was run individually in its own machine learning pipeline, which includes the StandardScaler transform and the regression model. No additional preprocessing or regression ensembles were considered. Model performance was compared using the mean absolute error (MAE), mean squared error (MSE), coefficient of determination (R2), and explained variance score (EVS) calculated between the sea ice concentration predicted from the test data and the respective true sea ice concentration. Each model was trained on the training split.
The Bayes, OLS, and SGD models were run as-is, with no additional hyperparameters provided; they serve as out-of-the-box baselines for comparison against the RF, Lasso, Ridge, and Elastic-Net models. The Random Forest regression model was also run with its default hyperparameters. LassoCV, RidgeCV, and ElasticCV all implement regularization and a corresponding cross-validation approach to optimize the regularization parameter alpha (Pedregosa et al., 2011). Each was provided with an array of values from 1.0 to 0.00001 to explore in its selection process. ElasticCV uses an additional parameter, a ratio of the L1 and L2 regularization terms (see the results discussion for further details), for which 0.5 was used.
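A minimal sketch of this setup is shown below, continuing from the split prepared in section 2.3.1. The model labels mirror Table 1; the granularity of the alpha grid (20 log-spaced values between 1.0 and 0.00001) and the alpha of the standalone Elastic-Net are assumptions, since only the range and the 0.75 L1/L2 ratio are given in the text.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import (BayesianRidge, ElasticNet, ElasticNetCV,
                                  LassoCV, LinearRegression, RidgeCV, SGDRegressor)
from sklearn.metrics import (explained_variance_score, mean_absolute_error,
                             mean_squared_error, r2_score)
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

alphas = np.logspace(0, -5, 20)  # regularization strengths from 1.0 down to 0.00001
models = {
    "Random Forest": RandomForestRegressor(random_state=0),
    "LassoCV": LassoCV(alphas=alphas, cv=5),
    "ElasticCV": ElasticNetCV(alphas=alphas, l1_ratio=0.5, cv=5),
    "RidgeCV": RidgeCV(alphas=alphas, cv=5),
    "Bayes": BayesianRidge(),
    "OLS": LinearRegression(),
    "SGD": SGDRegressor(),
    # Standalone Elastic-Net with the 0.75 L1/L2 ratio from section 3.3;
    # its alpha is not reported, so the default is an assumption.
    "Elastic": ElasticNet(l1_ratio=0.75),
}

for name, regressor in models.items():
    # Each pipeline applies the z-score scaling before the regressor, as in the text.
    pipe = make_pipeline(StandardScaler(), regressor)
    pipe.fit(X_train, y_train)
    pred = pipe.predict(X_test)
    print(f"{name}: MAE={mean_absolute_error(y_test, pred):.3f} "
          f"MSE={mean_squared_error(y_test, pred):.3f} "
          f"R2={r2_score(y_test, pred):.3f} "
          f"EVS={explained_variance_score(y_test, pred):.3f}")
```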
3 Results and Discussion
3.1 Regression Algorithm Performance
The results for each model are shown in Table 1, with the best performing model, RF, at the top and the worst performing model, Elastic, at the bottom. Feature importance for each model is shown in Figure 1, where OLS, Bayes, and SGD all highlight close to the same feature importance, though their performance is poor compared to the Random Forest algorithm.
| Model | MAE | MSE | R2 | EVS |
| --- | --- | --- | --- | --- |
| Random Forest | 4.151 | 22.603 | 0.834 | 0.847 |
| LassoCV | 4.604 | 28.785 | 0.788 | 0.796 |
| ElasticCV | 4.602 | 30.159 | 0.778 | 0.788 |
| RidgeCV | 4.620 | 30.508 | 0.776 | 0.783 |
| Bayes | 4.623 | 30.522 | 0.775 | 0.785 |
| OLS | 4.620 | 30.530 | 0.775 | 0.782 |
| SGD | 4.659 | 31.285 | 0.770 | 0.780 |
| Elastic | 4.619 | 31.737 | 0.766 | 0.780 |

Table 1 – Regression model performance from best to worst (MAE: mean absolute error; MSE: mean squared error; R2: coefficient of determination; EVS: explained variance score).
Figure 1: Model feature importance. Upper right shows RF (top) and LassoCV (below) feature selections.
In all of our models, sea surface temperature dominates the feature importance; in the ERA5 data this variable is a direct representation of the sea ice surface temperature. The OLS, Bayes, SGD, RidgeCV, and ElasticCV models show a strong negative relationship for surface temperature, sea surface temperature, wind direction, wind magnitude, and relative humidity. Low cloud cover and downwelling longwave radiation show a small positive relationship, which suggests they have a net effect of attenuating sea ice melt, while the other forcing variables appear to help drive sea ice melt.
3.2 Random Forest Results
The Random Forest algorithm, without any hyperparameter adjustments, performed well as an overall predictor, with the lowest MSE of all the models and the best R2 at 0.834. Despite this performance, the RF model favors sea surface temperature and minimizes all other features. This is most likely a result of sea surface temperature contributing the most to the impurity reduction at each decision tree node and thereby accumulating weight as the most important feature, a known issue with impurity-based feature importance, which is biased towards high-cardinality features (Pedregosa et al., 2011). Further tuning the RF algorithm and using permutation importance in place of the impurity-based measure may improve the RF feature ranking and reduce the bias.
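A sketch of the permutation-importance alternative is shown below, using scikit-learn's inspection module and the split from the earlier sketches; the number of repeats and random seed are arbitrary choices.

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rf_pipe = make_pipeline(StandardScaler(), RandomForestRegressor(random_state=0))
rf_pipe.fit(X_train, y_train)

# Permutation importance shuffles one feature at a time on the held-out test set
# and measures the resulting drop in R2, which avoids the impurity-based bias
# toward continuous, high-cardinality features.
result = permutation_importance(rf_pipe, X_test, y_test, n_repeats=30, random_state=0)
for name, mean_imp in zip(predictor_vars, result.importances_mean):
    print(f"{name}: {mean_imp:.3f}")
```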
3.3 Ridge and Lasso
In addition to the RF algorithm, which merits further tuning, the RidgeCV, LassoCV, and ElasticCV regression models were of the most interest because of their use of a regularization term. Each of these models, like OLS, minimizes the residual sum of squares (RSS) as its cost function, but Ridge and Lasso regression add a penalty term, the L2 and L1 norm respectively, to that cost function. These penalties shrink large feature coefficients, which results in a more robust method of feature selection and importance determination.
The L1 regularization term is the 1-norm of the regression coefficient vector, added to the cost function with a coefficient, lambda, that can be selected to weight its contribution. The L2 term is the squared 2-norm of the coefficient vector, that is, the sum of the squared coefficients (Raschka et al., 2022). Equations 1 and 2 show the L1 and L2 norms.

$$\|w\|_1 = \sum_{j=1}^{m} |w_j| \tag{1}$$

$$\|w\|_2^2 = \sum_{j=1}^{m} w_j^2 \tag{2}$$

The cost, or loss, function for the regression models is the residual sum of squares, given by equation 3.

$$J(w) = \sum_{i=1}^{n} \left(y^{(i)} - \hat{y}^{(i)}\right)^2 \tag{3}$$
RidgeCV utilizes the L2 norm as the regularization penalty, shown in equation 4, where $\lambda$ is the tunable weight that determines the contribution of the L2 norm to the loss function (Pedregosa et al., 2011).

$$J(w)_{\text{Ridge}} = \sum_{i=1}^{n} \left(y^{(i)} - \hat{y}^{(i)}\right)^2 + \lambda \|w\|_2^2 \tag{4}$$

For Lasso we have the same form but with the L1 norm, equation 5,

$$J(w)_{\text{Lasso}} = \sum_{i=1}^{n} \left(y^{(i)} - \hat{y}^{(i)}\right)^2 + \lambda \|w\|_1 \tag{5}$$

and ElasticCV uses an additional hyperparameter, a ratio that blends the contributions of the two norms (Pedregosa et al., 2011).
Our RidgeCV and ElasticCV models performed better than OLS, Bayes, and SGD, but not as well as the RF and LassoCV algorithms. LassoCV came in second best on performance but eliminated all features except sea surface temperature (sst) and surface air temperature (t2m); the L1 penalty zeroed out the remaining features to improve model performance, whereas RidgeCV, using the L2 norm, and ElasticCV, using a 0.5 ratio of both norms, did not eliminate any features. The results of the LassoCV model demonstrate a bias similar to that expressed by the RF algorithm and suggest further investigation into hyperparameter tuning before additional conclusions can be drawn.
Each of these three models performed cross-validation on five variations of the training set and optimized model performance over the range 1.0 to 0.00001 for alpha; all of them converged on 1.0 as the best value. We also ran a single ElasticNet model without cross-validation, with an L1/L2 ratio of 0.75, for comparison. The heavy weight on the L1 norm from the 0.75 ratio eliminates all the same features as the Lasso regression except downwelling longwave radiation, which is not the strongest positive feature in the other models.
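The coefficient behavior described here can be inspected directly, as in the hedged sketch below, which continues from the earlier pipelines: the alpha chosen by cross-validation is exposed as `alpha_`, and coefficients driven to exactly zero are the features the L1 penalty eliminated.

```python
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

alphas = np.logspace(0, -5, 20)
lasso_pipe = make_pipeline(StandardScaler(), LassoCV(alphas=alphas, cv=5))
lasso_pipe.fit(X_train, y_train)

lasso = lasso_pipe.named_steps["lassocv"]
print("alpha selected by cross-validation:", lasso.alpha_)
# Zero-valued coefficients correspond to features eliminated by the L1 penalty.
for name, coef in zip(predictor_vars, lasso.coef_):
    print(f"{name}: {coef:.3f}")
```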
3.4 Linearity Challenges
In our consideration of multivariate linear regression models, we assume that the atmospheric variables have a linear relationship with sea ice concentration. In the initial exploration of the data, we see multicollinearity between surface temperature (t2m), sea surface temperature (sst), and relative humidity (r), which may pose challenges for some linear regression models. Our interest in the Ridge and Lasso regression models stems from their ability to handle multicollinearity within the data (Raschka et al., 2022). What we have not considered are the interaction terms between our predictor variables. Physical processes and feedback cycles exist between low cloud cover, downwelling longwave radiation, and humidity. These potential interactions may not be readily expressed by the linear models as configured above and may benefit from exploring polynomial regression models, as sketched below.
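One low-effort way to test this, under the same linear-model framework, would be to insert pairwise interaction terms ahead of the regressor; the sketch below uses scikit-learn's PolynomialFeatures with interaction terms only and is an assumption about how such an experiment might be set up, not part of the analysis above.

```python
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler

# interaction_only=True adds products of feature pairs (e.g. lcc x ssrd, t2m x r)
# without adding pure squared terms, letting a linear model express simple feedbacks.
interaction_pipe = make_pipeline(
    StandardScaler(),
    PolynomialFeatures(degree=2, interaction_only=True, include_bias=False),
    LassoCV(cv=5),
)
interaction_pipe.fit(X_train, y_train)
print("R2 with interaction terms:", interaction_pipe.score(X_test, y_test))
```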
4 Conclusions
The Random Forest model shows the most potential for best fit and prediction, but the strong bias in its feature selection warrants further investigation and model tuning. The improved performance of the cross-validation models suggests further gains may be achievable with a larger cross-validation. Five of the eight models we explored show strong agreement in feature importance, which motivates continuing our approach to determining potential weights for the atmospheric forcing variables. Next steps will focus on improving the Random Forest model to minimize the bias in feature selection and on checking whether the refined results agree with the other models.
As an initial exploration of supervised machine learning for identifying strong forcing relationships between atmospheric processes and sea ice ablation, our results are still inconclusive, but they show promise with further model refinement and analysis.
Appendices
Figure A1: Pair plot and distribution of the ERA5 reanalysis data and NSIDC sea ice concentration (tc_mid).
Figure A2: Correlation matrix of ERA5 and sea ice variables.
Figure A3: Actual vs. predicted sea ice concentration for the Random Forest algorithm. MSE: 22.603, R2: 0.834.
References
Eayrs, C., Holland, D., Francis, D., Wagner, T., Kumar, R., & Li, X. (2019). Understanding the Seasonal Cycle of Antarctic Sea Ice Extent in the Context of Longer‐Term Variability. Reviews of Geophysics, 57(3), 1037–1064. https://doi.org/10.1029/2018RG000631
Fetterer, F., & Stewart, J. S. (2020). U.S. National Ice Center Arctic and Antarctic Sea Ice Concentration and Climatologies in Gridded Format, Version 1. National Snow and Ice Data Center. https://doi.org/10.7265/46cc-3952
Hersbach, H., Bell, B., Berrisford, P., Hirahara, S., Horányi, A., Muñoz‐Sabater, J., et al. (2020). The ERA5 global reanalysis. Quarterly Journal of the Royal Meteorological Society, 146(730), 1999–2049. https://doi.org/10.1002/qj.3803
Kusahara, K., Williams, G. D., Massom, R., Reid, P., & Hasumi, H. (2019). Spatiotemporal dependence of Antarctic sea ice variability to dynamic and thermodynamic forcing: a coupled ocean–sea ice model study. Climate Dynamics, 52(7–8), 3791–3807. https://doi.org/10.1007/s00382-018-4348-3
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., et al. (2011). Scikit-learn: Machine Learning in Python. Journal of Machine Learning Research, 12, 2825–2830.
Raschka, S., Liu, Y., Mirjalili, V., & Dzhulgakov, D. (2022). Machine learning with PyTorch and Scikit-Learn: develop machine learning and deep learning models with Python. Birmingham Mumbai: Packt.
Smith, M. B., Labat, J.-P., Fraser, A. D., Massom, R. A., & Koubbi, P. (2011). A GIS approach to estimating interannual variability of sea ice concentration in the Dumont d’Urville Sea near Terre Adélie from 2003 to 2009. Polar Science, 5(2), 104–117. https://doi.org/10.1016/j.polar.2011.04.007
A Bit on Neurodiversity
Advocacy for Atypical Thinkers
I wrote this short essay for an assignment in a social policy class back in 2012 or 2013 I think. It was during the first few years after my diagnosis when I was beginning to understand the paradigm of ableist culture.
Neurodiversity is a term coined by sociologist Judy Singer to describe conditions like autism, dyslexia, and ADHD. Her hope was to set a new tone in the everyday discourse about atypical ways of thinking, one that didn't focus on the deficits and impairments of such disorders. Singer's term was adopted quickly by the large and growing autism rights movement. Today the neurodiversity movement, though rooted in the autism rights movement, is a global movement that encompasses all of the following neurological differences: autism spectrum disorder (ASD), which also includes Asperger's Syndrome, dyspraxia, dyscalculia, dyslexia, Tourette's, and AD(H)D (neurodiversitysymposium).
Growing up, I was an early victim of the assembly-line public school system, immediately labeled inattentive and disruptive by some teachers, while others reported they found me a delight in class and a model student. Despite the few positive words, the negative reports are what gained momentum in the chit-chatty teacher circles, and before I even finished second grade, unbeknownst to me, the next ten years of my academic life had been pre-determined.
I had applied to community college after high school wanting to start in a general science curriculum, but upon examining my grades and test scores the admissions advisor told me I wasn’t smart enough for science and should consider something like broadcasting because I have a great personality. Needless to say, I barely survived two semesters of community college before completely dropping out.
I was diagnosed with ADD, combined type, at age 28 and began taking medication and going to therapy to help undo the past 28 years of being told I was stupid or lazy, along with learning how to move forward with a fresh outlook. I know from firsthand experience how horrible the world can be when you can't perform within the established constraints set forth by today's society.
The neurodiversity movement is an extension of the disability rights movement into the cognitive, affective and perceptual differences realm and is collectively represented by individuals, families, allies, advocates and organizations of the various disorders that fall under its umbrella. To explore the movement in more detail, I will focus on the ADHD sub-movement as a result of my personal involvement.
The primary return on advocating for neurodiversity is to ensure equal opportunity and rights for atypical thinkers and to increase awareness and acceptance, leading to larger-scale changes in which society can benefit from the many unique talents found within the atypical-thinker group. Ultimately our goal is to encourage society to appreciate and celebrate cognitive differences while asking for reasonable accommodations in areas like education and the work environment. Given the 10,154 articles on ADHD in the past year, one might be surprised to find that ADHD awareness and understanding are still very limited outside of the institutions, organizations, families, and individuals who are directly affected by it. Various organizations exist to provide advocacy and support for the ADHD community, including CHADD (Children and Adults with Attention-Deficit/Hyperactivity Disorder), ADHDAware, ADDA (Attention Deficit Disorder Association), and help4adhd. These non-profit organizations are funded by donations from individual members and groups in return for providing information, resources, and advertising space. CHADD provides resources and facilities throughout the nation where individuals can obtain support via volunteering professionals and support groups, along with local advocacy resources.
The most recurring topic among the three websites mentioned in the previous section is that of dispelling ADHD myths, specifically that of what ADHD really is. For example:
Public perceptions of attention-deficit hyperactivity disorder (ADHD) are replete with myths, misconceptions and misinformation about the nature, course and treatment of the disorder. Popular misconceptions assert that ADHD is not a disorder or at minimum, is a benign one that is over-diagnosed. Critics often claim that children are needlessly medicated by parents who have not properly managed their unruly, unmotivated or underachieving children, or who are looking for an academic advantage (e.g., testing or classroom accommodations) in competitive, high-stakes educational environments.
The above quote comes from the help4adhd website and summarizes the most prevalent myth and awareness issue for ADHD and those who suffer from it. The site then builds its argument by citing various research studies. Studies over the past 100 years demonstrate that ADHD is a chronic disorder that has a negative impact on virtually every aspect of daily social, emotional, academic, and work functioning (Barkley, 1998). Dr. Russell Barkley is a research psychiatrist with the State University of New York (SUNY) Upstate Medical University who has devoted the past 40 years of his career to understanding ADHD. On its Myths and Misunderstandings page, help4adhd.org cites 26 different research papers to provide a varied background of information supporting its answers and conclusions.
Despite the large number of references to scientific articles, research in the psychological, psychiatric, and neurological fields is very challenging for many reasons that are beyond the scope of this paper. Keeping this in mind, there is always room for some inquisitive skepticism about the validity of the claims made by the researchers. I know from reading Dr. Barkley's papers that he clearly states the uncertainties he is aware of in his studies, explains how they may skew the data, and invites others to replicate his work to help strengthen or dismiss his findings.
Autism, Asperger's, ADHD, dyslexia, dyspraxia, and Tourette's are neurological conditions that can't be cured or corrected. Those of us who are born with these neurological differences are quite capable of contributing to society in many great ways, but unfortunately many of us are lost and tossed aside because of the rigid structure of today's society, which is highly unfriendly and unwelcoming to non-neurotypical people.
For more on human centered design, accessibility and neurodiversity check out my Medium page.
References:
Barkley, R. A. (1998). Attention-deficit hyperactivity disorder: A handbook for diagnosis and treatment. New York: Guilford Press.
Tiny Timeline of Early Math
This was a project for a history of science class I took. I was rather happy with the final product, so why not share it!