This repository has been archived by the owner on Nov 17, 2023. It is now read-only.

Small Change in R Coding Style
This PR is a very small change, but it can help keep the coding style consistent. I used `<-` instead of `=` for assignment, as it is the more commonly recommended style in R.
XD-DENG committed Mar 30, 2016
1 parent 68890c2 commit 715b570
Showing 4 changed files with 43 additions and 43 deletions.
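For reference, the style change boils down to preferring the assignment arrow for top-level assignment while keeping `=` for named function arguments. A minimal illustration (not part of the diff below):

```r
x <- c(1, 2, 3)        # recommended: assignment arrow for assignment
y = c(1, 2, 3)         # works, but discouraged by most R style guides
mean(x, na.rm = TRUE)  # `=` remains the syntax for named arguments
```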
12 changes: 6 additions & 6 deletions R-package/vignettes/classifyRealImageWithPretrainedModel.Rmd
@@ -35,13 +35,13 @@ Make sure you unzip the pre-trained model in current folder. And we can use the
loading function to load the model into R.

```{r}
-model = mx.model.load("Inception/Inception_BN", iteration=39)
+model <- mx.model.load("Inception/Inception_BN", iteration = 39)
```

We also need to load in the mean image, which is used for preprocessing using ```mx.nd.load```.

```{r}
-mean.img = as.array(mx.nd.load("Inception/mean_224.nd")[["mean_img"]])
+mean.img <- as.array(mx.nd.load("Inception/mean_224.nd")[["mean_img"]])
```

Load and Preprocess the Image
@@ -52,7 +52,7 @@ from imager package. But you can always change it to other images.
Load and plot the image:

```{r, fig.align='center'}
-im <- load.image(system.file("extdata/parrots.png", package="imager"))
+im <- load.image(system.file("extdata/parrots.png", package = "imager"))
plot(im)
```

@@ -64,7 +64,7 @@ Because mxnet is deeply integerated with R, we can do all the processing in R fu
The preprocessing function:

```{r}
-preproc.image <-function(im, mean.image) {
+preproc.image <- function(im, mean.image) {
# crop the image
shape <- dim(im)
short.edge <- min(shape[1:2])
@@ -77,7 +77,7 @@ preproc.image <-function(im, mean.image) {
resized <- resize(croped, 224, 224)
# convert to array (x, y, channel)
arr <- as.array(resized)
-dim(arr) = c(224, 224, 3)
+dim(arr) <- c(224, 224, 3)
# substract the mean
normed <- arr - mean.img
# Reshape to format needed by mxnet (width, height, channel, num)
@@ -98,7 +98,7 @@ Now we are ready to classify the image! We can use the predict function
to get the probability over classes.

```{r}
-prob <- predict(model, X=normed)
+prob <- predict(model, X = normed)
dim(prob)
```

32 changes: 16 additions & 16 deletions R-package/vignettes/fiveMinutesNeuralNetwork.Rmd
@@ -19,14 +19,14 @@ First of all, let us load in the data and preprocess it:
require(mlbench)
require(mxnet)
-data(Sonar, package="mlbench")
-Sonar[,61] = as.numeric(Sonar[,61])-1
-train.ind = c(1:50, 100:150)
-train.x = data.matrix(Sonar[train.ind, 1:60])
-train.y = Sonar[train.ind, 61]
-test.x = data.matrix(Sonar[-train.ind, 1:60])
-test.y = Sonar[-train.ind, 61]
+data(Sonar, package = "mlbench")
+Sonar[,61] <- as.numeric(Sonar[,61])-1
+train.ind <- c(1:50, 100:150)
+train.x <- data.matrix(Sonar[train.ind, 1:60])
+train.y <- Sonar[train.ind, 61]
+test.x <- data.matrix(Sonar[-train.ind, 1:60])
+test.y <- Sonar[-train.ind, 61]
```

Next we are going to use a multi-layer perceptron as our classifier. In `mxnet`, we have a function called `mx.mlp` so that users can build a general multi-layer neural network to do classification or regression.
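For context, a minimal sketch of what such an `mx.mlp` call might look like on the Sonar split prepared above; the hyperparameter values here are illustrative assumptions, not taken from this commit:

```r
# Illustrative only: train a small softmax MLP on the Sonar training data.
mx.set.seed(0)
model <- mx.mlp(train.x, train.y, hidden_node = 10, out_node = 2,
                out_activation = "softmax", num.round = 20,
                array.batch.size = 15, learning.rate = 0.07,
                momentum = 0.9, eval.metric = mx.metric.accuracy)
```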
@@ -59,8 +59,8 @@ graph.viz(model$symbol$as.json())
```

```{r}
-preds = predict(model, test.x)
-pred.label = max.col(t(preds))-1
+preds <- predict(model, test.x)
+pred.label <- max.col(t(preds)) - 1
table(pred.label, test.y)
```

@@ -73,11 +73,11 @@ Again, let us preprocess the data first.
```{r}
data(BostonHousing, package="mlbench")
-train.ind = seq(1, 506, 3)
-train.x = data.matrix(BostonHousing[train.ind, -14])
-train.y = BostonHousing[train.ind, 14]
-test.x = data.matrix(BostonHousing[-train.ind, -14])
-test.y = BostonHousing[-train.ind, 14]
+train.ind <- seq(1, 506, 3)
+train.x <- data.matrix(BostonHousing[train.ind, -14])
+train.y <- BostonHousing[train.ind, 14]
+test.x <- data.matrix(BostonHousing[-train.ind, -14])
+test.y <- BostonHousing[-train.ind, 14]
```

Although we can use `mx.mlp` again to do regression by changing the `out_activation`, this time we are going to introduce a flexible way to configure neural networks in `mxnet`. The configuration is done by the "Symbol" system in `mxnet`, which takes care of the links among nodes, the activation, dropout ratio, etc. To configure a multi-layer neural network, we can do it in the following way:
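For context, a rough sketch of such a Symbol configuration, consistent with the `lro` symbol referenced in the hunk below; it is included for orientation only and is not part of this diff:

```r
# Sketch of the Symbol API: input -> one fully connected layer -> linear regression loss
data <- mx.symbol.Variable("data")
fc1  <- mx.symbol.FullyConnected(data, num_hidden = 1)
lro  <- mx.symbol.LinearRegressionOutput(fc1)
```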
@@ -108,7 +108,7 @@ model <- mx.model.FeedForward.create(lro, X=train.x, y=train.y,
It is also easy to make prediction and evaluate

```{r}
-preds = predict(model, test.x)
+preds <- predict(model, test.x)
sqrt(mean((preds-test.y)^2))
```

12 changes: 6 additions & 6 deletions doc/R-package/classifyRealImageWithPretrainedModel.md
@@ -73,14 +73,14 @@ loading function to load the model into R.


```r
-model = mx.model.load("Inception/Inception_BN", iteration=39)
+model <- mx.model.load("Inception/Inception_BN", iteration=39)
```

We also need to load in the mean image, which is used for preprocessing using ```mx.nd.load```.


```r
-mean.img = as.array(mx.nd.load("Inception/mean_224.nd")[["mean_img"]])
+mean.img <- as.array(mx.nd.load("Inception/mean_224.nd")[["mean_img"]])
```

Load and Preprocess the Image
@@ -92,7 +92,7 @@ Load and plot the image:


```r
-im <- load.image(system.file("extdata/parrots.png", package="imager"))
+im <- load.image(system.file("extdata/parrots.png", package = "imager"))
plot(im)
```

@@ -107,7 +107,7 @@ The preprocessing function:


```r
-preproc.image <-function(im, mean.image) {
+preproc.image <- function(im, mean.image) {
# crop the image
shape <- dim(im)
short.edge <- min(shape[1:2])
@@ -120,7 +120,7 @@ preproc.image <-function(im, mean.image) {
resized <- resize(croped, 224, 224)
# convert to array (x, y, channel)
arr <- as.array(resized)
-dim(arr) = c(224, 224, 3)
+dim(arr) <- c(224, 224, 3)
# substract the mean
normed <- arr - mean.img
# Reshape to format needed by mxnet (width, height, channel, num)
@@ -143,7 +143,7 @@ to get the probability over classes.


```r
-prob <- predict(model, X=normed)
+prob <- predict(model, X = normed)
dim(prob)
```

30 changes: 15 additions & 15 deletions doc/R-package/fiveMinutesNeuralNetwork.md
@@ -34,14 +34,14 @@ require(mxnet)
```

```r
-data(Sonar, package="mlbench")
+data(Sonar, package = "mlbench")

-Sonar[,61] = as.numeric(Sonar[,61])-1
-train.ind = c(1:50, 100:150)
-train.x = data.matrix(Sonar[train.ind, 1:60])
-train.y = Sonar[train.ind, 61]
-test.x = data.matrix(Sonar[-train.ind, 1:60])
-test.y = Sonar[-train.ind, 61]
+Sonar[,61] <- as.numeric(Sonar[,61])-1
+train.ind <- c(1:50, 100:150)
+train.x <- data.matrix(Sonar[train.ind, 1:60])
+train.y <- Sonar[train.ind, 61]
+test.x <- data.matrix(Sonar[-train.ind, 1:60])
+test.y <- Sonar[-train.ind, 61]
```

Next we are going to use a multi-layer perceptron as our classifier. In `mxnet`, we have a function called `mx.mlp` so that users can build a general multi-layer neural network to do classification or regression.
@@ -102,15 +102,15 @@ graph.viz(model$symbol$as.json())
[<img src="https://raw.githubusercontent.com/dmlc/web-data/master/mxnet/knitr/graph.computation.png">](https://github.com/dmlc/mxnet)

```r
-preds = predict(model, test.x)
+preds <- predict(model, test.x)
```

```
## Auto detect layout of input matrix, use rowmajor..
```

```r
-pred.label = max.col(t(preds))-1
+pred.label <- max.col(t(preds)) - 1
table(pred.label, test.y)
```

@@ -131,11 +131,11 @@ Again, let us preprocess the data first.
```r
data(BostonHousing, package="mlbench")

-train.ind = seq(1, 506, 3)
-train.x = data.matrix(BostonHousing[train.ind, -14])
-train.y = BostonHousing[train.ind, 14]
-test.x = data.matrix(BostonHousing[-train.ind, -14])
-test.y = BostonHousing[-train.ind, 14]
+train.ind <- seq(1, 506, 3)
+train.x <- data.matrix(BostonHousing[train.ind, -14])
+train.y <- BostonHousing[train.ind, 14]
+test.x <- data.matrix(BostonHousing[-train.ind, -14])
+test.y <- BostonHousing[-train.ind, 14]
```

Although we can use `mx.mlp` again to do regression by changing the `out_activation`, this time we are going to introduce a flexible way to configure neural networks in `mxnet`. The configuration is done by the "Symbol" system in `mxnet`, which takes care of the links among nodes, the activation, dropout ratio, etc. To configure a multi-layer neural network, we can do it in the following way:
@@ -224,7 +224,7 @@ It is also easy to make prediction and evaluate


```r
-preds = predict(model, test.x)
+preds <- predict(model, test.x)
```

```

0 comments on commit 715b570
