Combining and comparing models using Model Confidence Set

By Gabriel Vasconcelos

In many cases, especially if you are dealing with forecasting models, it is natural to use a large set of models to forecast the same variable and then select the best model using some error measure. For example, you can break the sample into a training sample (in-sample) and a test sample (out-of-sample), estimate all models on the training sample and see how they perform on the test sample. You could compare the models using the root mean squared error (RMSE) or the mean absolute error (MAE).
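As a minimal sketch of that comparison (with simulated data, not the post's example), the code below splits a sample, fits two models on the training part, and compares them on the test part with RMSE and MAE. The resulting loss series is exactly the kind of input the Model Confidence Set procedure works on.

```r
# Hypothetical example: compare two forecasting models on a hold-out sample.
set.seed(123)
n <- 200
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)

train <- 1:150   # in-sample
test  <- 151:n   # out-of-sample

m1 <- lm(y ~ x, data = data.frame(y = y[train], x = x[train]))
m2 <- lm(y ~ 1, data = data.frame(y = y[train]))  # naive benchmark

p1 <- predict(m1, newdata = data.frame(x = x[test]))
p2 <- predict(m2, newdata = data.frame(x = x[test]))

rmse <- function(e) sqrt(mean(e^2))
mae  <- function(e) mean(abs(e))

rmse(y[test] - p1); rmse(y[test] - p2)
mae(y[test] - p1);  mae(y[test] - p2)
```

Here model 1 should beat the naive benchmark on both measures, since the data were generated with a strong linear signal.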


Pricing Optimization: How to find the price that maximizes your profit

By Yuri Fonseca

Basic idea

In this post we will briefly discuss pricing optimization. The main idea behind this problem is the following question: as the manager of a company or store, how much should I charge in order to maximize my revenue or profit?
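To make the question concrete, here is an illustrative sketch (with assumed parameters, not from the post): a linear demand curve q(p) = a - b*p, a unit cost c, and profit (p - c) * q(p), maximized numerically with base R's `optimize`.

```r
# Assumed demand and cost parameters (hypothetical)
a <- 100   # demand intercept
b <- 2     # price sensitivity
cost <- 10 # unit cost

# Profit as a function of price: margin times quantity sold
profit <- function(p) (p - cost) * (a - b * p)

# Search between the unit cost and the choke price a/b
opt <- optimize(profit, interval = c(cost, a / b), maximum = TRUE)
opt$maximum    # optimal price; analytically (a/b + cost)/2 = 30
opt$objective  # profit at the optimum
```

With linear demand the optimum has a closed form, (a/b + cost)/2, which is a useful sanity check on the numerical answer.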


Treating your data: The old school vs tidyverse modern tools

By Gabriel Vasconcelos

When I first started using R there was no such thing as the tidyverse. Although some of the tidyverse packages were available independently, I learned to treat my data mostly by brute force, combining pieces of information I had from several sources. It is very interesting to compare this old-school programming with tidyverse code written using the magrittr pipe. Even if you want to stay old school, the tidyverse is here to stay, and it is the first tool taught in many data science courses based on R.
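A small side-by-side sketch of the two styles (hypothetical data, not the post's example): grouped means computed the base R way with `aggregate` and subsetting, then the same result with dplyr verbs chained by the magrittr pipe.

```r
library(dplyr)

# Hypothetical data: two groups of five values each
df <- data.frame(group = rep(c("a", "b"), each = 5), value = 1:10)

# Old school: aggregate, then subset by hand
agg <- aggregate(value ~ group, data = df, FUN = mean)
agg[agg$value > 3, ]

# tidyverse: the same pipeline, read top to bottom
df %>%
  group_by(group) %>%
  summarise(value = mean(value)) %>%
  filter(value > 3)
```

Both versions keep only group "b" (mean 8); the pipe version simply states the steps in the order they happen.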


The package hdm for double selection inference with a simple example

By Gabriel Vasconcelos

In a recent post I discussed Double Selection (DS), a procedure for inference after selecting controls. I showed an example of the consequences of ignoring the variable-selection step, discussed in an article by Belloni, Chernozhukov and Hansen.
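As a hedged sketch of how this looks with the hdm package (simulated data; the call assumes hdm's `rlassoEffect` with the "double selection" method, not the post's exact example):

```r
library(hdm)

set.seed(1)
n <- 100; p <- 50
X <- matrix(rnorm(n * p), n, p)   # candidate controls
d <- X[, 1] + rnorm(n)            # treatment, correlated with a control
y <- 0.5 * d + X[, 1] + rnorm(n)  # outcome; true treatment effect = 0.5

# Double selection: lasso selects controls for both y and d,
# then OLS of y on d and the union of selected controls
fit <- rlassoEffect(x = X, y = y, d = d, method = "double selection")
summary(fit)
```

The point of the union step is that a naive single lasso on the outcome equation can drop controls that matter for the treatment, biasing the estimated effect.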


Counterfactual estimation on nonstationary data, be careful!!!

By Gabriel Vasconcelos

In a recent paper, which can be downloaded here, Carvalho, Masini and Medeiros show that estimating counterfactuals in a non-stationary framework (by non-stationary I mean integrated) is a tricky task. It is intuitive that the models will not work properly in the absence of cointegration (the spurious case), but what the authors show is that even with cointegration, the average treatment effect (ATE) converges to a non-standard distribution. As a result, standard tests on the ATE will identify treatment effects in several cases where there is no effect at all.
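The spurious case is easy to demonstrate yourself. In this classic illustration (my own simulation, not the paper's), two independent random walks are regressed on each other and the standard t-test finds a "significant" relationship that does not exist:

```r
set.seed(123)
n <- 500

# Two integrated series, generated independently of each other
y1 <- cumsum(rnorm(n))
y2 <- cumsum(rnorm(n))

# A standard regression typically reports a highly "significant" slope
summary(lm(y1 ~ y2))
```

The paper's more surprising message is that the problem does not fully disappear under cointegration: inference on the ATE still requires non-standard critical values.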


ArCo Package v0.2 is out

The ArCo package 0.2 is now available on CRAN. The functions are now more user-friendly. The new features are:

  • A default model for estimation if the user does not supply the functions fn and p.fn. The default model is Ordinary Least Squares.
  • The user can now pass extra arguments to the fn function in the call.
  • The data will be automatically coerced when possible.
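A hedged sketch of the new default behavior (simulated data; the call assumes the package's `fitArCo` function with arguments `data`, `treated.unit` and `t0`, and relies on the default OLS model since fn and p.fn are omitted):

```r
library(ArCo)

set.seed(1)
# 100 periods, 5 units; unit 1 receives a level shift from period 51 on
Y <- matrix(rnorm(100 * 5), 100, 5)
Y[51:100, 1] <- Y[51:100, 1] + 1

# No fn / p.fn supplied: the default OLS estimator is used
arco <- fitArCo(data = list(Y), treated.unit = 1, t0 = 51)
arco
```

The counterfactual for unit 1 is built from the untreated units' pre-intervention relationship, so the estimated effect should be close to the simulated shift of 1.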

Dealing with S3 methods in R with a simple example

By Gabriel Vasconcelos

S3 objects

R has three object systems: S3, S4 and RC. S3 is by far the easiest to work with and it can make your code much more understandable and organized, especially if you are working on a package. The idea is very simple. First we define a class for some object in R, and then we define methods (functions) for this class based on generic functions that you may create or on the ones already available.
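A minimal sketch of those two steps in base R (a toy class of my own invention): assign a class, create a generic with `UseMethod`, and write a method plus a default fallback.

```r
# Constructor: a list tagged with the class "dog"
new_dog <- function(name) structure(list(name = name), class = "dog")

# A generic function: dispatch on the class of the first argument
speak <- function(x, ...) UseMethod("speak")

# Method for our class, and a fallback for everything else
speak.dog <- function(x, ...) cat(x$name, "says woof\n")
speak.default <- function(x, ...) cat("I don't know how to speak\n")

d <- new_dog("Rex")
speak(d)        # dispatches to speak.dog
speak(1:3)      # dispatches to speak.default
```

This is the same mechanism behind familiar generics like `print`, `summary` and `plot`, which is why `summary(my_model)` can behave differently for every model class.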
