
Commit

Merge pull request #284 from ioam/dataframe
Data API for flexible data manipulation and access
jlstevens committed Nov 9, 2015
2 parents 8e3cedc + 3fc7036 commit e91b018
Showing 41 changed files with 2,529 additions and 1,142 deletions.
26 changes: 4 additions & 22 deletions doc/Tutorials/Pandas_Conversion.ipynb
@@ -140,24 +140,6 @@
"HTML(df.reset_index().to_html())"
]
},
- {
-  "cell_type": "markdown",
-  "metadata": {},
-  "source": [
-   "For now though, the a, b, and c columns is all we'll need. To confirm the dataframe was converted correctly we can call the `.info` method on Table:"
-  ]
- },
- {
-  "cell_type": "code",
-  "execution_count": null,
-  "metadata": {
-   "collapsed": false
-  },
-  "outputs": [],
-  "source": [
-   "example_table.info"
-  ]
- },
{
"cell_type": "markdown",
"metadata": {},
@@ -267,7 +249,7 @@
},
"outputs": [],
"source": [
- "macro_df = pd.read_csv('http://ioam.github.com/holoviews/Tutorials/macro.csv', sep='\\t')"
+ "macro_df = pd.read_csv('http://ioam.github.com/holoviews/Tutorials/macro.csv', '\\t')"
]
},
{
@@ -404,7 +386,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Now that we've extracted the gdp_curves we can apply some operations to them. The collapse method applies some function across the data along the supplied dimensions. This lets us quickly compute the mean GDP Growth by year for example, but it also allows us to map a function with parameters to the data and visualize the resulting samples. A simple example is computing a curve for each percentile and embedding it in an NdOverlay.\n",
+ "Now that we've extracted the gdp_curves we can apply some operations to them. The collapse method applies some function across the data along the supplied dimensions. This let's us quickly compute a the mean GDP Growth by year for example, but it also allows us to map a function with parameters to the data and visualize the resulting samples. A simple example is computing a curve for each percentile and embedding it in an NdOverlay.\n",
"\n",
"Additionally we can apply a Palette to visualize the range of percentiles."
]
@@ -462,9 +444,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "Using the .select method we can pull out the data for just a few countries and specific years. We can also make more advanced use of the Palettes.\n",
+ "Using the .select method we can pull out the data for just a few countries and specific years. We can also make more advanced use the Palettes.\n",
"\n",
- "Palettes can be customized by selecting only a subrange of the underlying cmap to draw the colors from. The Palette draws samples from the colormap using the supplied sample_fn, which by default just draws linear samples but may be overriden with any function that draws samples in the supplied ranges. By slicing the Set1 colormap we draw colors only from the upper half of the palette and then reverse it."
+ "Palettes can customized by selecting only a subrange of the underlying cmap to draw the colors from. The Palette draws samples from the colormap using the supplied sample_fn, which by default just draws linear samples but may be overriden with any function that draws samples in the supplied ranges. By slicing the Set1 colormap we draw colors only from the upper half of the palette and then reverse it."
]
},
{
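The percentile-collapse idea described in the tutorial text above can be sketched without holoviews at all. Below is a minimal numpy illustration of collapsing one dimension with a reduction, and with a parameterised percentile function; the data and variable names are invented for the example, not taken from the tutorial:

```python
import numpy as np

# Invented stand-in for the GDP growth curves: 5 "countries" x 20 "years".
rng = np.random.default_rng(0)
growth = rng.normal(loc=2.0, scale=1.5, size=(5, 20))

# A collapse-style reduction: apply one function across the country
# dimension, leaving a single value per year (here, the mean).
mean_by_year = growth.mean(axis=0)

# The same idea with a parameterised function: one curve per percentile,
# which is what the overlay-of-percentiles example builds and colors
# with a Palette.
percentiles = (25, 50, 75)
curves = {p: np.percentile(growth, p, axis=0) for p in percentiles}
```

Each value in `curves` has the same length as the year axis, so each percentile yields one curve that could be overlaid and colored from a sampled colormap range.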
11 changes: 5 additions & 6 deletions doc/Tutorials/Sampling_Data.ipynb
@@ -274,7 +274,7 @@
"outputs": [],
"source": [
"raster = hv.Raster(np.random.rand(3, 3))\n",
- "raster + hv.Points(raster.table().keys())[-1:3, -1:3] + raster.table()"
+ "raster + hv.Points(raster)[-1:3, -1:3] + raster.table()"
]
},
{
@@ -316,7 +316,7 @@
"source": [
"extents = (0, 0, 3, 3)\n",
"img = hv.Image(np.random.rand(3, 3), bounds=extents)\n",
- "img + hv.Points(img.table().keys(), extents=extents) + img.table()"
+ "img + hv.Points(img, extents=extents) + img.table()"
]
},
{
@@ -462,7 +462,7 @@
"extents = (0, 0, 10, 10)\n",
"img = hv.Image(np.random.rand(10, 10), bounds=extents)\n",
"img_coords = hv.Points(img.table(), extents=extents)\n",
- "img + img * img_coords * hv.Points([img.closest((5,5))])(style=dict(color='r')) + img.sample([(5, 5)])"
+ "img + img * img_coords * hv.Points([img.closest([(5,5)])])(style=dict(color='r')) + img.sample([(5, 5)])"
]
},
{
@@ -481,7 +481,7 @@
"outputs": [],
"source": [
"sampled = img.sample(y=5)\n",
- "img + img * img_coords * hv.Points(zip(sampled.table().keys(), [img.closest((5,5))[1]]*10)) + sampled"
+ "img + img * img_coords * hv.Points(zip(sampled['x'], [img.closest(y=5)]*10)) + sampled"
]
},
{
@@ -610,8 +610,7 @@
"cell_type": "code",
"execution_count": null,
"metadata": {
- "collapsed": false,
- "scrolled": true
+ "collapsed": false
},
"outputs": [],
"source": [
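The `closest` and `sample` calls changed above both rest on the same operation: snapping a continuous coordinate to the nearest sample centre of a regular grid. A small self-contained sketch of that lookup (the function name and signature are invented for illustration, not the holoviews API):

```python
import numpy as np

def closest_sample(data, bounds, x, y):
    """Return (x_centre, y_centre, value) of the grid cell nearest (x, y).

    bounds is (left, bottom, right, top), matching the extents tuples
    used in the tutorial cells above; data is a 2D array whose first
    row corresponds to the top of the bounds.
    """
    left, bottom, right, top = bounds
    ny, nx = data.shape
    dx = (right - left) / nx
    dy = (top - bottom) / ny
    # Map continuous coordinates to row/column indices, clamped to the grid.
    col = min(nx - 1, max(0, int((x - left) / dx)))
    row = min(ny - 1, max(0, int((top - y) / dy)))
    # Recover the cell-centre coordinates for that index.
    x_c = left + (col + 0.5) * dx
    y_c = top - (row + 0.5) * dy
    return x_c, y_c, data[row, col]

data = np.arange(9.0).reshape(3, 3)
x_c, y_c, value = closest_sample(data, (0, 0, 3, 3), 1.2, 1.7)
```

Here the query point (1.2, 1.7) snaps to the centre cell at (1.5, 1.5), which is the behaviour the red marker in the `closest` cell above visualizes.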
2 changes: 1 addition & 1 deletion doc/Tutorials/Showcase.ipynb
@@ -246,7 +246,7 @@
"source": [
"sample_pos = (0,0.25)\n",
"annotated = circular_wave * hv.Points([sample_pos])\n",
- "sample = circular_wave.sample(samples=[sample_pos]).reindex().to.curve('Phase', 'Amplitude')",
+ "sample = circular_wave.sample(samples=[sample_pos]).to.curve('Phase', 'Amplitude', ['Frequency'])",
"annotated + sample"
]
},
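The `.to.curve('Phase', 'Amplitude', ['Frequency'])` change above passes the remaining key dimension explicitly instead of calling `.reindex()` first. Conceptually, the conversion groups the sampled table by those remaining dimensions and treats each group's (Phase, Amplitude) pairs as one curve. A pandas sketch of that grouping, with an invented table standing in for the sampled data:

```python
import pandas as pd

# Invented sample of a (Frequency, Phase, Amplitude) table; grouping by
# 'Frequency' (the dimension passed as the third argument above) and
# keeping the (Phase, Amplitude) pairs mirrors the conversion: one curve
# per remaining key-dimension value.
df = pd.DataFrame({
    'Frequency': [1, 1, 2, 2],
    'Phase':     [0.0, 3.1, 0.0, 3.1],
    'Amplitude': [0.5, 0.7, 0.9, 1.1],
})
curves = {freq: grp[['Phase', 'Amplitude']].to_numpy()
          for freq, grp in df.groupby('Frequency')}
```

Each entry in `curves` is one (Phase, Amplitude) series, which is why no `reindex` step is needed once the grouping dimensions are stated.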
2 changes: 1 addition & 1 deletion doc/reference_data
Submodule reference_data updated 76 files
+ holoviews_Elements_py2/data_029.pkl
+ holoviews_Elements_py2/data_030.pkl
+1 −1 holoviews_Elements_py2/display_022.html
+1 −1 holoviews_Elements_py2/display_030.html
+1 −1 holoviews_Elements_py2/display_031.html
+ holoviews_Elements_py3/data_029.pkl
+ holoviews_Elements_py3/data_030.pkl
+1 −1 holoviews_Elements_py3/display_022.html
+1 −1 holoviews_Elements_py3/display_030.html
+1 −1 holoviews_Elements_py3/display_031.html
+1 −1 holoviews_Exploring_Data_py2/display_002.html
+1 −1 holoviews_Exploring_Data_py2/display_010.html
+1 −1 holoviews_Exploring_Data_py3/display_002.html
+1 −1 holoviews_Exploring_Data_py3/display_010.html
+ holoviews_Pandas_Conversion_py2/data_008.pkl
+ holoviews_Pandas_Conversion_py2/data_009.pkl
+1 −1 holoviews_Pandas_Conversion_py2/display_001.html
+1 −1 holoviews_Pandas_Conversion_py2/display_002.html
+1 −1 holoviews_Pandas_Conversion_py2/display_003.html
+1 −1 holoviews_Pandas_Conversion_py2/display_004.html
+1 −1 holoviews_Pandas_Conversion_py2/display_005.html
+1 −1 holoviews_Pandas_Conversion_py2/display_006.html
+1 −1 holoviews_Pandas_Conversion_py2/display_007.html
+1 −1 holoviews_Pandas_Conversion_py2/display_008.html
+1 −1 holoviews_Pandas_Conversion_py2/display_009.html
+1 −1 holoviews_Pandas_Conversion_py2/display_010.html
+1 −1 holoviews_Pandas_Conversion_py2/display_011.html
+1 −1 holoviews_Pandas_Conversion_py2/display_012.html
+1 −1 holoviews_Pandas_Conversion_py2/display_013.html
+0 −1 holoviews_Pandas_Conversion_py2/display_014.html
+ holoviews_Pandas_Conversion_py3/data_008.pkl
+ holoviews_Pandas_Conversion_py3/data_009.pkl
+1 −1 holoviews_Pandas_Conversion_py3/display_001.html
+1 −1 holoviews_Pandas_Conversion_py3/display_002.html
+1 −1 holoviews_Pandas_Conversion_py3/display_003.html
+1 −1 holoviews_Pandas_Conversion_py3/display_004.html
+1 −1 holoviews_Pandas_Conversion_py3/display_005.html
+1 −1 holoviews_Pandas_Conversion_py3/display_006.html
+1 −1 holoviews_Pandas_Conversion_py3/display_007.html
+1 −1 holoviews_Pandas_Conversion_py3/display_008.html
+1 −1 holoviews_Pandas_Conversion_py3/display_009.html
+1 −1 holoviews_Pandas_Conversion_py3/display_010.html
+1 −1 holoviews_Pandas_Conversion_py3/display_011.html
+1 −1 holoviews_Pandas_Conversion_py3/display_012.html
+1 −1 holoviews_Pandas_Conversion_py3/display_013.html
+0 −1 holoviews_Pandas_Conversion_py3/display_014.html
+ holoviews_Sampling_Data_py2/data_007.pkl
+ holoviews_Sampling_Data_py2/data_008.pkl
+ holoviews_Sampling_Data_py2/data_011.pkl
+ holoviews_Sampling_Data_py2/data_013.pkl
+ holoviews_Sampling_Data_py2/data_014.pkl
+ holoviews_Sampling_Data_py2/data_015.pkl
+ holoviews_Sampling_Data_py2/data_016.pkl
+ holoviews_Sampling_Data_py2/data_017.pkl
+ holoviews_Sampling_Data_py2/data_018.pkl
+1 −1 holoviews_Sampling_Data_py2/display_006.html
+1 −1 holoviews_Sampling_Data_py2/display_009.html
+1 −1 holoviews_Sampling_Data_py2/display_011.html
+1 −1 holoviews_Sampling_Data_py2/display_014.html
+1 −1 holoviews_Sampling_Data_py2/display_015.html
+1 −1 holoviews_Sampling_Data_py2/display_016.html
+ holoviews_Sampling_Data_py3/data_007.pkl
+ holoviews_Sampling_Data_py3/data_008.pkl
+ holoviews_Sampling_Data_py3/data_011.pkl
+ holoviews_Sampling_Data_py3/data_013.pkl
+ holoviews_Sampling_Data_py3/data_014.pkl
+ holoviews_Sampling_Data_py3/data_015.pkl
+ holoviews_Sampling_Data_py3/data_016.pkl
+ holoviews_Sampling_Data_py3/data_017.pkl
+ holoviews_Sampling_Data_py3/data_018.pkl
+1 −1 holoviews_Sampling_Data_py3/display_006.html
+1 −1 holoviews_Sampling_Data_py3/display_009.html
+1 −1 holoviews_Sampling_Data_py3/display_011.html
+1 −1 holoviews_Sampling_Data_py3/display_014.html
+1 −1 holoviews_Sampling_Data_py3/display_015.html
+1 −1 holoviews_Sampling_Data_py3/display_016.html
1 change: 1 addition & 0 deletions holoviews/core/__init__.py
@@ -1,4 +1,5 @@
from .boundingregion import * # pyflakes:ignore (API import)
+ from .data import *         # pyflakes:ignore (API import)
from .dimension import * # pyflakes:ignore (API import)
from .element import * # pyflakes:ignore (API import)
from .layout import * # pyflakes:ignore (API import)
