Economics
Showing new listings for Thursday, 14 November 2024
- [1] arXiv:2411.08053 [pdf, other]
Title: We must re-evaluate assumptions about carbon trading for effective climate change mitigation
Comments: 18 pages, 2 figures
Subjects: General Economics (econ.GN); Atmospheric and Oceanic Physics (physics.ao-ph); Applications (stat.AP)
Effective climate action depends on dismantling the assumptions and oversimplifications that have become the basis of climate policy. The assumption that greenhouse gases (GHG) are fungible and the use of single-point values in normalizing GHG species to CO2-equivalents can propagate inaccuracies in carbon accounting and have already led to failures of carbon offset systems. Separate emission reduction targets and tracking by GHG species are recommended to achieve long-term climate stabilization.
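To make concrete why a single-point conversion factor matters, consider methane, whose CO2-equivalence depends strongly on the chosen time horizon. The figures below use IPCC AR5 global warming potentials for illustration; they are not taken from the paper.

```latex
% Illustration (IPCC AR5 values, not from the paper): one tonne of CH4 converts to
% very different CO2-equivalents depending on the chosen time horizon.
1\ \mathrm{t\ CH_4} \times \mathrm{GWP}_{100} \approx 28\ \mathrm{t\ CO_2e},
\qquad
1\ \mathrm{t\ CH_4} \times \mathrm{GWP}_{20} \approx 84\ \mathrm{t\ CO_2e}.
```

The roughly threefold gap between the two horizons illustrates how a single fixed conversion value can misstate the warming impact of an offset that trades one gas against another.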
- [2] arXiv:2411.08067 [pdf, html, other]
Title: A note on the Cobb-Douglas function
Comments: 2 pages
Subjects: General Economics (econ.GN)
This note observes that the Cobb-Douglas function is uniquely characterized by the following property: if the labour share of cost for a constant-returns-to-scale firm remains constant whenever the firm minimizes its cost for any given output level, then the firm's production function must be Cobb-Douglas.
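For the easy direction of this characterization (the converse is the note's contribution), a Cobb-Douglas technology delivers a constant labour share of cost directly from cost minimization; the notation below (output $y$, wage $w$, rental rate $r$, multiplier $\lambda$) is generic.

```latex
% Sketch of the easy direction: Cobb-Douglas implies a constant labour share of cost.
\min_{K,L}\; wL + rK \quad \text{s.t.}\quad A K^{1-\alpha} L^{\alpha} = y
\;\Longrightarrow\;
wL = \lambda\alpha y,\quad rK = \lambda(1-\alpha) y
\;\Longrightarrow\;
\frac{wL}{wL + rK} = \alpha .
```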
- [3] arXiv:2411.08174 [pdf, other]
Title: Estimating Variability in Hospital Charges: The Case of Cesarean Section
Subjects: General Economics (econ.GN)
This study sought to better understand the causes of price disparity in cesarean sections, using newly released hospital data. Beginning January 1, 2021, the Centers for Medicare and Medicaid Services (CMS) requires hospitals operating in the United States to publish online pricing information for the items and services they provide, in both a machine-readable format and a consumer-friendly shoppable format. Initial analyses of these data have shown that the price for a given procedure can differ both within a hospital and across hospitals. The cesarean section (C-section) was, as of 2018, one of the most common inpatient procedures performed across all hospitals in the United States. This preliminary study found that pricing for C-section procedures varied from as little as \$162 to as much as \$115,483 for a single procedure. Overall, indicators for quality and for whether the hospital was a teaching hospital were found to be statistically significant, while variables including median income and the Gini coefficient for wealth inequality were not.
- [4] arXiv:2411.08263 [pdf, html, other]
Title: On the Welfare (Ir)Relevance of Two-Stage Models
Subjects: Theoretical Economics (econ.TH)
In a two-stage model of choice, a decision maker first shortlists a given menu and then applies her preferences. We show that a sizeable class of these models runs into significant issues in terms of identification of preferences (welfare-relevance) and thus cannot be used for welfare analysis. We classify these models by their revealed preference principles and expose the principle that we deem to be the root of their identification issue. Taking our analysis to experimental data, we observe that half of the alternatives that are revealed preferred to another alternative under rational choice are left revealed preferred to nothing for any member of this class of models. Furthermore, the welfare-relevance of the specific models established in the literature is much worse. The model with the highest welfare-relevance produces a revealed preference relation with an average density of 2% (1 out of 45 possible comparisons revealed), while rational choice yields 63% (28 out of 45 possible comparisons). We argue that the issue is not an inherent feature of two-stage models; rather, it lies in the approach with which the first stage is modelled in the literature.
- [5] arXiv:2411.08350 [pdf, other]
Title: How Transit Countries Become Refugee Destinations: Insights from Central and Eastern Europe
Comments: 22 pages, 2 figures
Subjects: General Economics (econ.GN)
This study explores how refugees' destination preferences evolve during transit, with a focus on Central and Eastern Europe, particularly Romania. Using a mixed-methods approach, we analyse data from the International Organization for Migration's (IOM) Flow Monitoring Surveys and complement it with qualitative insights from focus group discussions with refugees. The quantitative analysis reveals that refugees' preferences for destination countries often change during transit, influenced by factors such as safety concerns, asylum conditions, education, and the presence of relatives at the destination. Our results support the application of bounded rationality and human capital theory, showing that while economic opportunities are important, safety becomes the dominant concern during transit. The qualitative analysis adds depth to these findings, highlighting the role of political instability, social networks, and economic hardships as initial migration drivers. Additionally, the study reveals how refugees reassess their destination choices based on their experiences in transit countries, with Romania emerging as a viable settlement destination due to its relative stability and access to asylum procedures. This research contributes to migration studies by challenging the traditional view of transit countries and offering new insights into the fluid nature of refugee decision-making.
- [6] arXiv:2411.08419 [pdf, html, other]
Title: Orchestrating Organizational Politics: Baron and Ferejohn Meet Tullock
Subjects: Theoretical Economics (econ.TH)
This paper examines the optimal organizational rules that govern the process of dividing a fixed surplus. The process is modeled as a sequential multilateral bargaining game with costly recognition. The designer sets the voting rule -- i.e., the minimum number of votes required to approve a proposal -- and the mechanism for proposer recognition, which is modeled as a biased generalized lottery contest. We show that for diverse design objectives, the optimum can be achieved by a dictatorial voting rule, which simplifies the game into a standard biased contest model.
- [7] arXiv:2411.08452 [pdf, other]
Title: Complementing Carbon Credits from Forest-Related Activities with Biodiversity Insurance and Resilience Value
Subjects: General Economics (econ.GN)
Carbon credits are a key component of most national and organizational climate strategies. Financing and delivering carbon credits from forest-related activities faces multiple risks at the project and asset levels. Financial mechanisms are employed to mitigate risks for investors and project developers, complemented by non-financial measures such as environmental and social safeguards and physical risk mitigation. Despite these efforts, academic research highlights that safeguards and climate risk mitigation measures are not efficiently implemented in some carbon projects and that specification of environmental safeguards remains underdeveloped. Further, environmental and social risk mitigation capacities may not be integrated into financial mechanisms. This text examines how ecosystem capacities can be leveraged and valued for mitigation of and adaptation to physical risks by complementing carbon credits with biodiversity insurance and resilience value.
- [8] arXiv:2411.08471 [pdf, html, other]
Title: Equilibrium Cycle: A "Dynamic" Equilibrium
Subjects: Theoretical Economics (econ.TH)
The Nash equilibrium (NE) is a fundamental game-theoretic concept for characterizing stability in static strategic-form games. However, the NE at times fails to capture outcomes in dynamic settings, where players' actions evolve over time in response to one another. In such cases, game dynamics fail to converge to an NE, instead exhibiting cyclic or oscillatory patterns. To address this, we introduce the concept of an 'equilibrium cycle' (EC). Unlike the NE, which defines a fixed point of mutual best responses, an EC is a set-valued solution concept designed to capture the asymptotic or long-term behavior of dynamic interactions, even when a traditional best response does not exist. The EC identifies a minimal rectangular set of action profiles that collectively capture oscillatory game dynamics, effectively generalizing the notion of stability beyond static equilibria. An EC satisfies three important properties: \textit{stability} against external deviations (ensuring robustness), \textit{unrest} with respect to internal deviations (driving oscillation), and \textit{minimality} (defining the solution's tightness). This set-valued outcome generalizes the minimal curb set to discontinuous games, where best responses may not exist. In finite games, the EC also relates to sink strongly connected components (SCCs) of the best response graph.
- [9] arXiv:2411.08483 [pdf, html, other]
Title: Industrial symbiosis: How to apply successfully
Subjects: Theoretical Economics (econ.TH)
The premise of industrial symbiosis (IS) is that advancing a circular economy that reuses byproducts as inputs in production is valuable for the environment. We challenge this premise in a simple model. Ceteris paribus, IS is an environmentally friendly approach; however, implementing IS may increase pollution in the market equilibrium. The reason is that producers' incentives for recycling can be driven by the income gained from selling recycled waste in the secondary market, and may therefore not align with environmental protection. That is, producers may boost production and the ensuing pollution in order to sell byproducts, without internalizing the pollution emitted in the primary industry or in the recycling process. We compare the market solution to the social optimum and identify a key technology parameter, the share of reused byproducts, that may yield mutual benefits for firms, consumers, and the environment.
- [10] arXiv:2411.08601 [pdf, html, other]
Title: Does the Gini index represent people's views on inequality?
Subjects: General Economics (econ.GN)
This paper presents findings from a web experiment on a representative sample of the French population. It examines the acceptability of the Pigou-Dalton principle of transfers, which posits that transferring income from an individual to a relatively poorer one reduces overall inequality. While up to 60% of respondents reject standard transfers, the three alternative transfers we test receive more approval, especially those promoting solidarity among lower-income recipients. The study then models respondents' preferences with two types of social welfare functions, utilitarian and Extended Gini. The Extended Gini model aligns better with individual preferences. Nevertheless, Extended Gini-type social welfare functions that adhere to the principle of transfers (including the one underlying the Gini index) poorly capture the preferences of individual respondents. Quite surprisingly, however, the preferences of the median individual align almost perfectly with the Gini-based function, using either parametric or non-parametric estimates.
- [11] arXiv:2411.08668 [pdf, html, other]
Title: A Machine Learning Algorithm for Finite-Horizon Stochastic Control Problems in Economics
Comments: arXiv admin note: substantial text overlap with arXiv:1611.01767
Subjects: General Economics (econ.GN); Optimization and Control (math.OC); Machine Learning (stat.ML)
We propose a machine learning algorithm for solving finite-horizon stochastic control problems based on a deep neural network representation of the optimal policy functions. The algorithm has three features: (1) it can solve high-dimensional (e.g., over 100 dimensions), finite-horizon, time-inhomogeneous stochastic control problems; (2) its performance improves monotonically in each iteration, leading to good convergence properties; and (3) it does not rely on the Bellman equation. To demonstrate the efficiency of the algorithm, we apply it to various finite-horizon, time-inhomogeneous problems, including recursive utility optimization under a stochastic volatility model, a multi-sector stochastic growth model, and optimal control under a dynamic stochastic integration of climate and economy model with an eight-dimensional state vector and 600 time periods.
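The abstract gives no implementation details, so the sketch below only illustrates the general idea it describes: represent a finite-horizon, time-inhomogeneous policy with a neural network and train it by simulating trajectories and maximizing the Monte Carlo objective directly, with no Bellman equation. The toy dynamics, cost, horizon, and network sizes are assumptions made for this example, not the paper's specification.

```python
# Illustrative sketch only (not the paper's algorithm): a time-inhomogeneous neural-network
# policy for a toy finite-horizon stochastic control problem, trained by simulating paths
# and maximizing the Monte Carlo objective directly; no Bellman equation is used.
import torch
import torch.nn as nn

T, n_paths, state_dim, dt = 20, 512, 2, 0.05   # horizon, simulated paths, state dim, step

class Policy(nn.Module):
    """Maps (t/T, state) to a control, so the policy can vary over time."""
    def __init__(self, state_dim, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + 1, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, 1),
        )

    def forward(self, t_frac, x):
        return self.net(torch.cat([t_frac, x], dim=-1))

policy = Policy(state_dim)
opt = torch.optim.Adam(policy.parameters(), lr=1e-3)

for _ in range(200):
    x = torch.zeros(n_paths, state_dim)
    reward = torch.zeros(n_paths, 1)
    for t in range(T):
        t_frac = torch.full((n_paths, 1), t / T)
        u = policy(t_frac, x)
        reward = reward - 0.5 * u**2 * dt                      # running cost (toy)
        x = x + u * dt + 0.2 * torch.randn_like(x) * dt**0.5   # controlled diffusion (toy)
    reward = reward + x.sum(dim=-1, keepdim=True)              # terminal payoff (toy)
    loss = -reward.mean()                                      # maximize expected reward
    opt.zero_grad()
    loss.backward()
    opt.step()
```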
- [12] arXiv:2411.08826 [pdf, html, other]
Title: The Structure of the U.S. Income Distribution
Comments: 33 pages
Subjects: General Economics (econ.GN)
I show that U.S. incomes follow a one-parameter family of probability distributions over more than fifty years of data. I compare statistical models of income, and I highlight the inverse-gamma distribution as a parsimonious model that matches data particularly well and has straightforward theoretical interpretations. However, despite having relatively few parameters, the inverse-gamma distribution still overfits income data. I establish a linear relationship between parameter estimates, and a one-dimensional model emerges naturally when I exploit this relationship. I conclude with theoretical remarks about the model.
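For reference, the inverse-gamma density highlighted in the abstract (shape $\alpha$, scale $\beta$) takes the standard form below; the one-parameter family then arises from the linear relationship the author estimates between the two fitted parameters.

```latex
% Standard inverse-gamma density with shape \alpha > 0 and scale \beta > 0.
f(x;\alpha,\beta) = \frac{\beta^{\alpha}}{\Gamma(\alpha)}\, x^{-\alpha-1} e^{-\beta/x},
\qquad x > 0 .
```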
New submissions (showing 12 of 12 entries)
- [13] arXiv:2411.07031 (cross-list from cs.AI) [pdf, other]
Title: Evaluating the Accuracy of Chatbots in Financial Literature
Subjects: Artificial Intelligence (cs.AI); Econometrics (econ.EM)
We evaluate the reliability of two chatbots, ChatGPT (4o and o1-preview versions) and Gemini Advanced, in providing references to the financial literature, and we employ novel methodologies to do so. Alongside the conventional binary approach commonly used in the literature, we developed a nonbinary approach and a recency measure to assess how hallucination rates vary with how recent a topic is. After analyzing 150 citations, we found that ChatGPT-4o had a hallucination rate of 20.0% (95% CI, 13.6%-26.4%), while o1-preview had a hallucination rate of 21.3% (95% CI, 14.8%-27.9%). In contrast, Gemini Advanced exhibited a much higher hallucination rate: 76.7% (95% CI, 69.9%-83.4%). While hallucination rates increased for more recent topics, this trend was not statistically significant for Gemini Advanced. These findings emphasize the importance of verifying chatbot-provided references, particularly in rapidly evolving fields.
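The reported intervals are consistent with a standard Wald (normal-approximation) 95% confidence interval for a proportion, assuming 150 citations per model and the implied hallucination counts; the short verification sketch below (not the authors' code) reproduces the published figures.

```python
# Verification sketch (not the authors' code): reproduce the reported 95% CIs using the
# Wald normal approximation, assuming n = 150 citations per model and the implied counts.
from math import sqrt

n = 150
for model, hallucinated in [("ChatGPT-4o", 30), ("o1-preview", 32), ("Gemini Advanced", 115)]:
    p = hallucinated / n
    se = sqrt(p * (1 - p) / n)
    lo, hi = p - 1.96 * se, p + 1.96 * se
    print(f"{model}: {p:.1%} (95% CI, {lo:.1%}-{hi:.1%})")
```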
- [14] arXiv:2411.08188 (cross-list from stat.ME) [pdf, html, other]
Title: MSTest: An R-Package for Testing Markov Switching Models
Subjects: Methodology (stat.ME); Econometrics (econ.EM); Computation (stat.CO)
We present the R package MSTest, which implements hypothesis testing procedures to identify the number of regimes in Markov switching models. These models have wide-ranging applications in economics, finance, and numerous other fields. The MSTest package includes the Monte Carlo likelihood ratio test procedures proposed by Rodriguez-Rondon and Dufour (2024), the moment-based tests of Dufour and Luger (2017), the parameter stability tests of Carrasco, Hu, and Ploberger (2014), and the likelihood ratio test of Hansen (1992). Additionally, the package enables users to simulate and estimate univariate and multivariate Markov switching and hidden Markov processes, using the expectation-maximization (EM) algorithm or maximum likelihood estimation (MLE). We demonstrate the functionality of the MSTest package through both simulation experiments and an application to U.S. GNP growth data.
- [15] arXiv:2411.08491 (cross-list from stat.ME) [pdf, other]
Title: Covariate Adjustment in Randomized Experiments Motivated by Higher-Order Influence Functions
Comments: 62 pages, 8 figures
Subjects: Methodology (stat.ME); Econometrics (econ.EM); Statistics Theory (math.ST)
Higher-Order Influence Functions (HOIF), developed in a series of papers over the past twenty years, are a fundamental theoretical device for constructing rate-optimal causal-effect estimators from observational studies. However, the value of HOIF for analyzing well-conducted randomized controlled trials (RCTs) has not been explicitly explored. The recent US Food \& Drug Administration (FDA) and European Medicines Agency (EMA) guidelines on the practice of covariate adjustment in analyzing RCTs recommend reporting, in addition to the simple, unadjusted difference-in-means estimator, an estimator that adjusts for baseline covariates via a simple parametric working model, such as a linear model. In this paper, we show that an HOIF-motivated estimator of the treatment-specific mean has significantly improved statistical properties compared to popular adjusted estimators used in practice when the number of baseline covariates $p$ is relatively large compared to the sample size $n$. We also characterize the conditions under which the HOIF-motivated estimator improves upon the unadjusted estimator. Furthermore, we demonstrate that a novel debiased adjusted estimator recently proposed by Lu et al. is, in fact, another HOIF-motivated estimator in disguise. Finally, simulation studies are conducted to corroborate our theoretical findings.
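For context, the linear covariate adjustment of a treatment-specific mean referred to above is usually written as below (a standard textbook form with arm-specific working-model coefficients $\hat\beta_a$ and full-sample covariate mean $\bar X_n$); this is shown only as background and is not the paper's HOIF-motivated estimator.

```latex
% Unadjusted vs. linearly adjusted estimators of the arm-a mean (standard forms shown
% for context; not the HOIF-motivated estimator studied in the paper).
\hat\mu_a^{\mathrm{unadj}} = \frac{1}{n_a}\sum_{i:\,A_i = a} Y_i ,
\qquad
\hat\mu_a^{\mathrm{adj}} = \frac{1}{n_a}\sum_{i:\,A_i = a}\bigl(Y_i - \hat\beta_a^{\top} X_i\bigr)
 + \hat\beta_a^{\top}\bar X_n .
```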
- [16] arXiv:2411.08720 (cross-list from q-fin.TR) [pdf, other]
Title: How Wash Traders Exploit Market Conditions in Cryptocurrency Markets
Comments: 14 pages main text
Subjects: Trading and Market Microstructure (q-fin.TR); General Economics (econ.GN)
Wash trading, the practice of simultaneously placing buy and sell orders for the same asset to inflate trading volume, has been prevalent in cryptocurrency markets. This paper investigates whether wash traders in Bitcoin act deliberately to exploit market conditions and identifies the characteristics of such manipulative behavior. Using a unique dataset of 18 million transactions from Mt. Gox, once the largest Bitcoin exchange, I find that wash trading intensifies when legitimate trading volume is low and diminishes when it is high, indicating strategic timing to maximize impact in less liquid markets. The activity also exhibits spillover effects across platforms and decreases when trading volumes in other asset classes like stocks or gold rise, suggesting sensitivity to broader market dynamics. Additionally, wash traders exploit periods of heightened media attention and online rumors to amplify their influence, causing rapid but short-lived spikes in legitimate trading volume. Using an exogenous demand shock associated with illicit online marketplaces, I find that wash trading responds to contemporaneous events affecting Bitcoin demand. These results advance the understanding of manipulative practices in digital currency markets and have significant implications for regulators aiming to detect and prevent wash trading.
Cross submissions (showing 4 of 4 entries)
- [17] arXiv:2206.04902 (replaced) [pdf, html, other]
Title: Forecasting macroeconomic data with Bayesian VARs: Sparse or dense? It depends!
Subjects: Econometrics (econ.EM); Applications (stat.AP); Methodology (stat.ME)
Vector autoregressions (VARs) are widely applied when it comes to modeling and forecasting macroeconomic variables. In high dimensions, however, they are prone to overfitting. Bayesian methods, more concretely shrinkage priors, have been shown to be successful in improving prediction performance. In the present paper, we introduce the semi-global framework, in which we replace the traditional global shrinkage parameter with group-specific shrinkage parameters. We show how this framework can be applied to various shrinkage priors, such as global-local priors and stochastic search variable selection priors. We demonstrate the virtues of the proposed framework in an extensive simulation study and in an empirical application forecasting data for the US economy. Further, we shed more light on the ongoing ``Illusion of Sparsity'' debate, finding that forecasting performance under sparse versus dense priors varies across evaluated economic variables and across time frames. Dynamic model averaging, however, can combine the merits of both worlds.
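As a rough sketch of the idea in generic notation (not the paper's exact prior), a global-local prior ties every coefficient to one global shrinkage parameter, whereas the semi-global framework gives each predefined coefficient group its own:

```latex
% Illustrative notation only: global-local prior vs. a semi-global variant in which the
% global parameter \lambda is replaced by a group-specific \lambda_{g(j)}.
\beta_j \mid \psi_j, \lambda \sim \mathcal{N}\bigl(0, \psi_j^{2}\lambda^{2}\bigr)
\quad\longrightarrow\quad
\beta_j \mid \psi_j, \lambda_{g(j)} \sim \mathcal{N}\bigl(0, \psi_j^{2}\lambda_{g(j)}^{2}\bigr).
```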
- [18] arXiv:2304.11415 (replaced) [pdf, html, other]
Title: Strategic Responses to Personalized Pricing and Demand for Privacy: An Experiment
Subjects: General Economics (econ.GN)
We consider situations where consumers are aware that a statistical model determines the price of a product based on their observed behavior. Using a novel experiment varying the context similarity between participant data and a product, we find that participants manipulate their responses to a survey about personal characteristics, and manipulation is more successful when the contexts are similar. Moreover, participants demand less privacy, and make less optimal privacy choices when the contexts are less similar. Our findings highlight the importance of data privacy policies in the age of big data, where behavior in seemingly unrelated contexts might affect prices.
- [19] arXiv:2401.01804 (replaced) [pdf, html, other]
Title: Efficient Computation of Confidence Sets Using Classification on Equidistributed Grids
Subjects: Econometrics (econ.EM); Machine Learning (stat.ML)
Economic models produce moment inequalities, which can be used to form tests of the true parameters. Confidence sets (CS) of the true parameters are derived by inverting these tests. However, they often lack analytical expressions, necessitating a grid search to obtain the CS numerically by retaining the grid points that pass the test. When the statistic is not asymptotically pivotal, constructing the critical value for each grid point in the parameter space adds to the computational burden. In this paper, we convert the computational issue into a classification problem by using a support vector machine (SVM) classifier. Its decision function provides a faster and more systematic way of dividing the parameter space into two regions: inside vs. outside of the confidence set. We label those points in the CS as 1 and those outside as -1. Researchers can train the SVM classifier on a grid of manageable size and use it to determine whether points on denser grids are in the CS or not. We establish certain conditions for the grid so that there is a tuning that allows us to asymptotically reproduce the test in the CS. This means that in the limit, a point is classified as belonging to the confidence set if and only if it is labeled as 1 by the SVM.
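As an illustration of the classification idea only (a toy stand-in, not the paper's test statistic or tuning), one can label a coarse grid by whether each parameter value passes the test, fit an SVM on those labels, and then classify a much denser grid instead of re-running the test everywhere:

```python
# Toy sketch of classifying grid points as inside (+1) / outside (-1) a confidence set.
# The test below is a hypothetical stand-in for the (expensive) moment-inequality test.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def passes_test(theta):
    # Hypothetical stand-in for "theta is not rejected at level alpha".
    return 1 if np.sum(theta**2) <= 1.0 else -1

# Coarse training grid of manageable size, labeled by the test.
coarse = rng.uniform(-2, 2, size=(500, 2))
labels = np.array([passes_test(t) for t in coarse])

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(coarse, labels)

# Much denser grid: classify with the SVM decision function instead of re-testing.
dense = np.stack(np.meshgrid(np.linspace(-2, 2, 200),
                             np.linspace(-2, 2, 200)), axis=-1).reshape(-1, 2)
in_cs = clf.predict(dense) == 1
print(f"Share of dense grid classified inside the confidence set: {in_cs.mean():.3f}")
```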
- [20] arXiv:2403.07799 (replaced) [pdf, other]
Title: Equitable Auctions
Subjects: Theoretical Economics (econ.TH); Computer Science and Game Theory (cs.GT)
We initiate the study of how auction design affects the division of surplus among buyers. We propose a parsimonious measure of equity and apply it to the family of standard auctions for homogeneous goods. Our surplus-equitable mechanism is efficient, Bayesian-Nash incentive compatible, and achieves surplus parity among winners ex post. The uniform-price auction is equity-optimal if and only if buyers have a pure common value. Contrary to intuition, the pay-as-bid auction is not always preferred in terms of equity when buyers have pure private values. In auctions with price mixing between pay-as-bid and uniform prices, we provide prior-free bounds on the equity-preferred pricing rule under a common regularity condition on signals.
- [21] arXiv:2403.11010 (replaced) [pdf, other]
Title: How Periodic Forecast Updates Influence MRP Planning Parameters: A Simulation Study
Subjects: General Economics (econ.GN)
In many supply chains, current efforts at digitalization have led to improved information exchange between manufacturers and their customers. Specifically, demand forecasts are often provided by customers and regularly updated as the related customer information improves. In this paper, we investigate the influence of forecast updates on Material Requirements Planning (MRP), a production planning method. A simulation study was carried out to assess how information updates affect the setting of planning parameters in a rolling-horizon, MRP-planned production system. An intuitive result is that information updates disturb the production orders generated by standard MRP; we therefore develop an MRP extension to mitigate these effects. A large numerical simulation experiment shows that the MRP safety stock exploitation heuristic developed here leads to significantly improved results with respect to inventory and backorder costs. An interesting result is that the fixed-order-quantity lot-sizing policy performs better, in most instances, than the fixed-order-period lot-sizing policy when periodic forecast updates occur. In addition, based on a comparative analysis of all instances, the simulation study shows that underestimating demand is marginally more costly than overestimating it. Furthermore, the results indicate that the MRP safety stock exploitation heuristic can mitigate the negative effects of biased forecasts.
- [22] arXiv:2404.12988 (replaced) [pdf, html, other]
Title: How Gender and Birth Order Affect Educational attainment Inequality within-Families: Evidence from Benin
Subjects: General Economics (econ.GN)
This paper examines how gender, birth order, and innate ability shape within-household disparities in children's educational attainment in developing countries. Using data from Benin, I find that in households with non-educated parents, gender and birth order drive over two-thirds of the average disparities in adult children's educational attainment, while their influence falls to one-third in households with college-educated parents. Furthermore, average inequality, measured by the range of children's educational attainment, is twice as high among non-educated parents as among college-educated parents. I propose and estimate a structural model of within-family educational attainment choices. Using the model, I show that the absence of gender and birth order effects does not lead to a significant reduction in average within-family disparities in children's educational attainment. Additionally, in theory, ensuring that every child has at least one year of education lowers average within-family educational inequality. Yet even in this scenario, daughters tend to receive less education than sons, and practical efforts to achieve universal entry are less effective than this theoretical benchmark.
- [23] arXiv:2407.15339 (replaced) [pdf, html, other]
Title: Deep Learning for Economists
Subjects: General Economics (econ.GN); Computation and Language (cs.CL); Computer Vision and Pattern Recognition (cs.CV)
Deep learning provides powerful methods to impute structured information from large-scale, unstructured text and image datasets. For example, economists might wish to detect the presence of economic activity in satellite images, or to measure the topics or entities mentioned in social media, the congressional record, or firm filings. This review introduces deep neural networks, covering methods such as classifiers, regression models, generative AI, and embedding models. Applications include classification, document digitization, record linkage, and methods for data exploration in massive-scale text and image corpora. When suitable methods are used, deep learning models can be cheap to tune and can scale affordably to problems involving millions or billions of data points. The review is accompanied by a companion website, EconDL, with user-friendly demo notebooks, software resources, and a knowledge base that provides technical details and additional applications.
- [24] arXiv:2411.03116 (replaced) [pdf, html, other]
Title: Generative AI and Security Operations Center Productivity: Evidence from Live Operations
Subjects: General Economics (econ.GN)
We measure the association between generative AI (GAI) tool adoption and security operations center productivity. We find that GAI adoption is associated with a 30.13% reduction in security incident mean time to resolution. This result is robust to several modeling decisions. While unobserved confounders inhibit causal identification, this result is among the first to use observational data from live operations to investigate the relationship between GAI adoption and security worker productivity.
- [25] arXiv:2411.05567 (replaced) [pdf, other]
Title: Workers as Partners: a Theory of Responsible Firms in Labor Markets
Subjects: General Economics (econ.GN)
We develop a theoretical framework analyzing responsible firms (REFs) that prioritize worker welfare alongside profits in labor markets with search frictions. At the micro level, REFs' use of market power varies with labor conditions: they refrain from using it in slack markets but may exercise it in tight markets without harming workers. Our macro analysis shows these firms offer higher wages, creating a distinct high-wage sector. When firms endogenously choose worker bargaining power, there is a trade-off between worker surplus and employment, though this improves with elastic labor supply. While REFs cannot survive with free entry, they can coexist with profit-maximizing firms under limited competition, where their presence forces ordinary firms to raise wages.
- [26] arXiv:2104.04716 (replaced) [pdf, other]
Title: Selecting Penalty Parameters of High-Dimensional M-Estimators using Bootstrapping after Cross-Validation
Comments: 164 pages, 14 figures
Subjects: Statistics Theory (math.ST); Econometrics (econ.EM)
We develop a new method for selecting the penalty parameter for $\ell_{1}$-penalized M-estimators in high dimensions, which we refer to as bootstrapping after cross-validation. We derive rates of convergence for the corresponding $\ell_1$-penalized M-estimator and also for the post-$\ell_1$-penalized M-estimator, which refits the non-zero entries of the former estimator without penalty in the criterion function. We demonstrate via simulations that our methods are not dominated by cross-validation in terms of estimation errors and can outperform cross-validation in terms of inference. As an empirical illustration, we revisit Fryer Jr (2019), who investigated racial differences in police use of force, and confirm his findings.
- [27] arXiv:2406.04191 (replaced) [pdf, html, other]
Title: Strong Approximations for Empirical Processes Indexed by Lipschitz Functions
Subjects: Statistics Theory (math.ST); Econometrics (econ.EM); Probability (math.PR); Methodology (stat.ME)
This paper presents new uniform Gaussian strong approximations for empirical processes indexed by classes of functions based on $d$-variate random vectors ($d\geq1$). First, a uniform Gaussian strong approximation is established for general empirical processes indexed by possibly Lipschitz functions, improving on previous results in the literature. In the setting considered by Rio (1994), and if the function class is Lipschitzian, our result improves the approximation rate $n^{-1/(2d)}$ to $n^{-1/\max\{d,2\}}$, up to a $\operatorname{polylog}(n)$ term, where $n$ denotes the sample size. Remarkably, we establish a valid uniform Gaussian strong approximation at the rate $n^{-1/2}\log n$ for $d=2$, which was previously known to be valid only for univariate ($d=1$) empirical processes via the celebrated Hungarian construction (Komlós et al., 1975). Second, a uniform Gaussian strong approximation is established for multiplicative separable empirical processes indexed by possibly Lipschitz functions, which addresses some outstanding problems in the literature (Chernozhukov et al., 2014, Section 3). Finally, two other uniform Gaussian strong approximation results are presented when the function class is a sequence of Haar basis based on quasi-uniform partitions. Applications to nonparametric density and regression estimation are discussed.
- [28] arXiv:2409.01911 (replaced) [pdf, html, other]
Title: Variable selection in convex nonparametric least squares via structured Lasso: An application to the Swedish electricity distribution networks
Subjects: Methodology (stat.ME); Econometrics (econ.EM)
We study the problem of variable selection in convex nonparametric least squares (CNLS). Whereas the least absolute shrinkage and selection operator (Lasso) is a popular technique for least squares, its variable selection performance is unknown in CNLS problems. In this work, we investigate the performance of the Lasso estimator and find that it is usually unable to select variables efficiently. Exploiting the unique structure of the subgradients in CNLS, we develop a structured Lasso method that combines the $\ell_1$-norm and the $\ell_{\infty}$-norm. A relaxed version of the structured Lasso is proposed to achieve model sparsity and predictive performance simultaneously, with the two effects, variable selection and model shrinkage, controlled by separate tuning parameters. A Monte Carlo study is implemented to verify the finite-sample performance of the proposed approaches. We also use real data from Swedish electricity distribution networks to illustrate the effects of the proposed variable selection techniques. The results from the simulation and the application confirm that the proposed structured Lasso performs favorably, generally leading to sparser and more accurate predictive models than the conventional Lasso methods in the literature.
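One natural way to read "combining the $\ell_1$-norm and the $\ell_{\infty}$-norm" in CNLS, where each observation $i$ carries its own subgradient vector $\beta_i \in \mathbb{R}^d$, is a grouped penalty that removes variable $j$ only if its coefficient vanishes in every subgradient; this is an illustrative reading of the abstract, not necessarily the paper's exact formulation.

```latex
% Illustrative grouped \ell_1/\ell_\infty penalty: an \ell_\infty norm across
% observations i within each variable j, summed (\ell_1) over variables.
\lambda \sum_{j=1}^{d} \max_{1 \le i \le n} \lvert \beta_{ij} \rvert .
```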