Tensor 0.4.11

Tensor (n-dimensional array) library for F#

     Core features:
       - n-dimensional arrays (tensors) in host memory or on CUDA GPUs
       - element-wise operations (addition, multiplication, absolute value, etc.)
       - basic linear algebra operations (dot product, SVD decomposition, matrix inverse, etc.)
       - reduction operations (sum, product, average, maximum, arg max, etc.)
       - logic operations (comparison, and, or, etc.)
       - views, slicing, reshaping, broadcasting (similar to NumPy)
       - scatter and gather by indices
       - standard functional operations (map, fold, etc.)
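     The core features above can be sketched as follows. This is a minimal, hedged example based on the library's documented HostTensor API; the exact function names (HostTensor.init, Tensor.sum, Tensor.sumAxis) and the slicing syntax are assumptions from the project documentation and may differ between versions.

     ```fsharp
     open Tensor

     // create a 3x3 tensor in host memory, filled via an index function
     // (HostTensor.init takes the shape as an int64 list and an index function)
     let a = HostTensor.init [3L; 3L] (fun [|i; j|] -> float (i * 3L + j))

     // element-wise operations with scalar broadcasting
     let b = a + a             // element-wise addition
     let c = abs (a - 4.0)     // broadcast subtraction, element-wise absolute value

     // views and slicing (NumPy-like, no copy)
     let row = a.[1L, *]       // second row as a one-dimensional view

     // reductions
     let total   = Tensor.sum a        // sum over all elements
     let colSums = Tensor.sumAxis 0 a  // sum along axis 0
     ```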

     Data exchange:
       - read/write support for HDF5 (.h5)
       - interop with standard F# types (Seq, List, Array, Array2D, Array3D, etc.)
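     A hedged sketch of the data-exchange features; the HDF5.OpenWrite / HostTensor.write / HostTensor.read names follow the project documentation but are assumptions here, as is HostTensor.ofList for F# interop.

     ```fsharp
     open Tensor

     let a = HostTensor.ofList [1.0; 2.0; 3.0]   // build a tensor from an F# list

     // write the tensor to an HDF5 file, then read it back
     do
         use hdf = HDF5.OpenWrite "tensors.h5"
         HostTensor.write hdf "a" a

     let a2 =
         use hdf = HDF5.OpenRead "tensors.h5"
         HostTensor.read<float> hdf "a"
     ```

     The .h5 file can then be opened from any other HDF5-capable tool (e.g. Python's h5py) for cross-language data exchange.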

     Performance:
       - host: SIMD- and BLAS-accelerated operations
         - by default, Intel MKL is used (shipped with the NuGet package)
         - other BLAS implementations (OpenBLAS, vendor-specific) can be selected via a configuration option
       - CUDA GPU: all operations are performed directly on the GPU; cuBLAS is used for matrix operations
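     Moving work to the GPU can be sketched as below. CudaTensor.transfer, HostTensor.transfer, and the .* dot-product operator follow the project documentation but are assumptions here; a CUDA-capable device is required.

     ```fsharp
     open Tensor

     let a = HostTensor.init [4L; 4L] (fun [|i; j|] -> float (i + j))

     // copy the host tensor to GPU memory
     let ca = CudaTensor.transfer a

     // operations on CUDA tensors execute on the GPU;
     // matrix products are dispatched to cuBLAS
     let cb = ca .* ca

     // copy the result back into host memory
     let b = HostTensor.transfer cb
     ```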

     Requirements:
       - Linux, macOS, or Windows on x64
       - Linux requires libgomp.so.1 to be installed.

     Additional algorithms are provided in the Tensor.Algorithm package.

Package Manager console:
    Install-Package Tensor -Version 0.4.11

.NET CLI:
    dotnet add package Tensor --version 0.4.11

PackageReference (for projects that support it, copy this XML node into the project file):
    <PackageReference Include="Tensor" Version="0.4.11" />

Paket CLI:
    paket add Tensor --version 0.4.11


Version History

Version Downloads Last updated
0.4.11 1,308 5/8/2018
0.4.11-v0.4.11-215 219 5/8/2018
0.4.11-symtensor-core-242 228 11/15/2018
0.4.11-symtensor-core-241 218 11/15/2018
0.4.11-symtensor-core-240 213 11/15/2018
0.4.11-symtensor-core-239 195 11/15/2018
0.4.11-symtensor-core-238 219 11/15/2018
0.4.11-symtensor-core-237 240 11/15/2018
0.4.11-symtensor-core-236 197 11/14/2018
0.4.11-symtensor-core-235 198 11/14/2018
0.4.11-symtensor-core-234 192 11/14/2018
0.4.11-symtensor-core-231 233 11/9/2018
0.4.11-symtensor-core-230 227 11/9/2018
0.4.11-symtensor-core-229 199 11/8/2018
0.4.11-symtensor-core-228 195 11/8/2018
0.4.11-symtensor-core-227 234 10/30/2018
0.4.11-symtensor-core-226 230 10/30/2018
0.4.11-symtensor-core-225 203 10/30/2018
0.4.11-develop-216 328 5/8/2018
0.4.10-develop-213 329 5/8/2018
0.4.10-develop-212 322 5/7/2018
0.4.10-develop-211 320 5/7/2018
0.3.0.712-master 323 9/1/2017
0.3.0.711-master 323 9/1/2017
0.3.0.710-master 307 9/1/2017
0.3.0.709-master 297 8/31/2017
0.3.0.708-master 318 8/30/2017
0.3.0.707-master 327 8/30/2017
0.3.0.706-master 308 8/30/2017
0.3.0.701-master 347 6/26/2017
0.3.0.700-master 366 6/22/2017
0.3.0.699-master 340 6/22/2017
0.3.0.698-master 336 6/21/2017
0.3.0.697-master 336 6/21/2017
0.3.0.696-master 362 6/21/2017
0.3.0.695-master 336 6/21/2017
0.3.0.694-master 328 6/21/2017
0.3.0.693-master 342 6/20/2017
0.3.0.692-master 331 6/19/2017
0.3.0.691-master 359 6/19/2017
0.3.0.690-master 340 6/19/2017
0.3.0.689-master 340 5/14/2017
0.3.0.688 1,423 5/14/2017
0.3.0.686-master 344 5/14/2017
0.2.0.591-master 351 4/19/2017
0.2.0.565-master 361 4/11/2017
0.2.0.556-master 351 3/21/2017
0.2.0.551-master 403 3/17/2017
0.2.0.540-master 336 3/15/2017
0.2.0.536-master 332 3/14/2017
0.2.0.519-master 345 3/2/2017
0.2.0.516-master 337 3/2/2017
0.2.0.499-master 361 2/13/2017
0.2.0.494-master 342 2/7/2017
0.2.0.479-master 362 2/1/2017
0.2.0.463-master 360 1/17/2017
0.2.0.431-master 434 12/2/2016
0.2.0.422-master 370 11/9/2016
0.2.0.421-master 363 11/9/2016
0.2.0.411-master 414 10/26/2016
0.2.0.400-master 363 10/26/2016
0.2.0.394-master 381 10/25/2016
0.2.0.382-master 368 10/21/2016
0.2.0.377-master 360 10/20/2016
0.2.0.323-master 362 10/11/2016
0.2.0.262-master 376 9/29/2016
0.2.0.248-master 380 9/27/2016
0.2.0.174-master 377 9/16/2016
0.2.0.128-master 375 9/8/2016
0.2.0.122-master 379 9/8/2016
0.2.0.121-master 370 9/7/2016
0.2.0.111-master 364 9/7/2016
0.2.0.105-ci 406 9/5/2016
0.2.0.97-ci 395 8/30/2016
0.2.0.96-ci 372 8/29/2016
0.2.0.90-ci 372 8/25/2016
0.2.0.89-ci 362 8/24/2016
0.2.0.88-ci 369 8/24/2016
0.2.0.87-ci 384 8/24/2016
0.2.0.86-ci 368 8/23/2016
0.2.0.85-ci 368 8/22/2016
0.2.0.84-ci 380 8/22/2016
0.2.0.83-ci 380 8/22/2016
0.2.0.82 593 8/22/2016
0.2.0.81-ci 382 8/19/2016
0.2.0.80-ci 394 6/27/2016
0.2.0.79-ci 389 6/27/2016
0.2.0.77-ci 397 6/22/2016
0.2.0.76-ci 397 6/22/2016
0.2.0.75 454 6/15/2016
0.2.0.74-ci 384 6/15/2016
0.2.0.73 425 6/15/2016
0.2.0.72 437 6/15/2016
0.2.0.71 468 6/14/2016
0.2.0.70 424 6/9/2016
0.2.0.69 394 6/9/2016
0.2.0.68 422 6/9/2016
0.2.0.67 495 6/8/2016
0.2.0.66-ci 383 6/8/2016
0.2.0.65-ci 377 6/8/2016
0.2.0.64-ci 420 6/8/2016
0.2.0.63-ci 372 6/7/2016
0.2.0.62 423 6/7/2016
0.2.0.61 405 6/6/2016
0.2.0.60 404 6/6/2016
0.2.0.59 402 6/6/2016
0.2.0.57 425 6/3/2016
0.2.0.56 416 6/3/2016
0.2.0.55 452 6/3/2016
0.2.0.54 429 6/3/2016
0.2.0.53 462 6/3/2016
0.2.0.52-ci 379 6/2/2016
0.2.0.51-ci 383 6/2/2016
0.2.0.50-ci 390 6/2/2016
0.2.0.49 485 5/31/2016
0.2.0.48-ci 393 5/31/2016
0.2.0.46-ci 382 5/31/2016
0.2.0.45 422 5/31/2016
0.2.0.44 427 5/31/2016
0.2.0.43 441 5/31/2016
0.2.0.42 441 5/30/2016
0.2.0.41 435 5/30/2016
0.2.0.40 426 5/30/2016
0.2.0.39 442 5/30/2016
0.2.0.38 426 5/30/2016
0.2.0.37 423 5/30/2016
0.2.0.36 426 5/25/2016
0.2.0.35 447 5/24/2016
0.2.0.34 437 5/24/2016
0.2.0.33 581 5/24/2016
0.2.0.32-ci 374 5/24/2016
0.1.26-ci 397 5/24/2016
0.1.24-ci 388 5/24/2016
0.1.19-ci 374 5/24/2016