EFCore.BulkExtensions 7.1.2

dotnet add package EFCore.BulkExtensions --version 7.1.2                
NuGet\Install-Package EFCore.BulkExtensions -Version 7.1.2                
This command is intended to be used within the Package Manager Console in Visual Studio, as it uses the NuGet module's version of Install-Package.
<PackageReference Include="EFCore.BulkExtensions" Version="7.1.2" />                
For projects that support PackageReference, copy this XML node into the project file to reference the package.
paket add EFCore.BulkExtensions --version 7.1.2                
#r "nuget: EFCore.BulkExtensions, 7.1.2"                
#r directive can be used in F# Interactive and Polyglot Notebooks. Copy this into the interactive tool or source code of the script to reference the package.
// Install EFCore.BulkExtensions as a Cake Addin
#addin nuget:?package=EFCore.BulkExtensions&version=7.1.2

// Install EFCore.BulkExtensions as a Cake Tool
#tool nuget:?package=EFCore.BulkExtensions&version=7.1.2                

EFCore.BulkExtensions

EntityFrameworkCore extensions:
-Bulk operations (very fast): Insert, Update, Delete, Read, Upsert, Sync, SaveChanges.
-Batch ops: Delete, Update (will be deprecated since EF7 has native ExecuteUpdate/ExecuteDelete) and Truncate.
The library is lightweight and very efficient, covering the most commonly used CRUD operations.
It was selected in the top 20 EF Core extensions recommended by Microsoft.
The latest version uses EF Core 7.
Supports all 4 major SQL databases: SQLServer, PostgreSQL, MySQL, SQLite.
Check out Testimonials from the Community and User Comments.

License

BulkExtensions is licensed under Dual License v1.0 (a solution to open-source funding, cFOSS: conditionally Free OSS).
If you do not meet the criteria for free usage under the community license, then you have to buy a commercial one.
If you are eligible for free usage but still want to help development and have active support, consider purchasing a Starter license.

Support

If you find this project useful you can mark it by leaving a GitHub Star ⭐.
And even with the community license you can make a donation via the "Buy Me A Coffee" or ⚡ buttons.

Contributing

Please read CONTRIBUTING for details on the code of conduct and the process for submitting pull requests.
When opening issues, please write a detailed explanation of the problem or feature, with a reproducible example.

Description

Supported databases:
-SQLServer (or SqlAzure) under the hood uses SqlBulkCopy for Insert; Update/Delete = BulkInsert + raw SQL MERGE.
-PostgreSQL (9.5+) uses COPY BINARY combined with ON CONFLICT for Update.
-MySQL (8+) uses MySqlBulkCopy combined with ON DUPLICATE for Update.
-SQLite has no copy tool; instead the library uses plain SQL combined with UPSERT.
Bulk tests can not use UseInMemoryDb because InMemoryProvider does not support Relational-specific methods.
Instead, the test options are SqlServer (Developer or Express), LocalDb (if alongside the Developer version), or the other adapters.

Installation

Available on NuGet: https://www.nuget.org/packages/EFCore.BulkExtensions/
That is the main NuGet package for all databases; there are also provider-specific packages for those who need smaller ones.
Only a single provider-specific package can be installed in a project; if more are needed, use the main one with all providers.
Package Manager Console command for installation: Install-Package EFCore.BulkExtensions
Provider-specific packages have an adapter suffix: MainNuget + .SqlServer/PostgreSql/MySql/Sqlite
Its assembly is Strong-Named and Signed with a key.

| Nuget | Target          | Used EF v.  | For projects targeting                   |
| ----- | --------------- | ----------- | ---------------------------------------- |
| 7.x   | Net 6.0         | EF Core 7.0 | Net 7.0+ or 6.0+                         |
| 6.x   | Net 6.0         | EF Core 6.0 | Net 6.0+                                 |
| 5.x   | NetStandard 2.1 | EF Core 5.0 | Net 5.0+                                 |
| 3.x   | NetStandard 2.0 | EF Core 3.n | NetCore(3.0+) or NetFrm(4.6.1+) MoreInfo |
| 2.x   | NetStandard 2.0 | EF Core 2.n | NetCore(2.0+) or NetFrm(4.6.1+)          |
| 1.x   | NetStandard 1.4 | EF Core 1.0 | NetCore(1.0+)                            |

Support follows the official .NET lifecycle: currently v7 is the latest and v6 is the LTS.

Usage

It's pretty simple and straightforward.
Bulk Extensions are extension methods on DbContext and are used with an entities List (both regular and Async methods are supported):

context.BulkInsert(entities);                 context.BulkInsertAsync(entities);
context.BulkInsertOrUpdate(entities);         context.BulkInsertOrUpdateAsync(entities);         // Upsert
context.BulkInsertOrUpdateOrDelete(entities); context.BulkInsertOrUpdateOrDeleteAsync(entities); // Sync
context.BulkUpdate(entities);                 context.BulkUpdateAsync(entities);
context.BulkDelete(entities);                 context.BulkDeleteAsync(entities);
context.BulkRead(entities);                   context.BulkReadAsync(entities);
context.BulkSaveChanges();                    context.BulkSaveChangesAsync();

-SQLite requires the package SQLitePCLRaw.bundle_e_sqlite3 with a call to SQLitePCL.Batteries.Init()
-MySQL: when running its tests for the first time, execute the SQL command (local-data): SET GLOBAL local_infile = true;
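
For example, a minimal SQLite setup sketch (the context and entity names are illustrative; only the Batteries.Init() call and the package requirement come from the note above):

SQLitePCL.Batteries.Init(); // must run once before the SQLite provider is used

using var context = new MyDbContext(); // hypothetical DbContext configured with UseSqlite(...)
var entities = new List<Item> { new Item { Name = "First" } };
context.BulkInsert(entities); // on SQLite this is executed as plain SQL with UPSERT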

Batch Extensions are made on IQueryable DbSet and can be used as in the following code segment.
They are executed as pure SQL, and no check is done whether some entities were previously loaded in memory and are being tracked.
(updateColumns is an optional param in which PropertyNames are added explicitly when a property needs to be updated to its default value)
Info about lock escalation in SQL Server, with a batch-iteration example as a solution, is at the bottom of the code segment.

// Delete
context.Items.Where(a => a.ItemId >  500).BatchDelete();
context.Items.Where(a => a.ItemId >  500).BatchDeleteAsync();

// Update (using Expression arg.) supports Increment/Decrement 
context.Items.Where(a => a.ItemId <= 500).BatchUpdate(a => new Item { Quantity = a.Quantity + 100 });
context.Items.Where(a => a.ItemId <= 500).BatchUpdateAsync(a => new Item { Quantity = a.Quantity + 100});
  // can be as value '+100' or as variable '+incrementStep' (int incrementStep = 100;)
  
// Update (via simple object)
context.Items.Where(a => a.ItemId <= 500).BatchUpdate(new Item { Description = "Updated" });
context.Items.Where(a => a.ItemId <= 500).BatchUpdateAsync(new Item { Description = "Updated" });
// Update (via simple object) - requires additional Argument for setting to Property default value
var updateCols = new List<string> { nameof(Item.Quantity) }; // Update 'Quantity' to default value ('0')
var q = context.Items.Where(a => a.ItemId <= 500);
int affected = q.BatchUpdate(new Item { Description = "Updated" }, updateCols); // returns the number of affected rows

// Batch iteration (useful in some cases to avoid lock escalation)
int chunkSize = 10000;
int rowsAffected;
var query = context.Items.Where(a => a.ItemId > 500);
do {
    rowsAffected = query.Take(chunkSize).BatchDelete();
} while (rowsAffected >= chunkSize);

// Truncate
context.Truncate<Entity>();
context.TruncateAsync<Entity>();

Performances

The following are performance measurements (in seconds):

  • For SQL Server (v. 2019):
| Ops \ Rows | EF 100K | Bulk 100K | EF 1 MIL. | Bulk 1 MIL. |
| ---------- | ------- | --------- | --------- | ----------- |
| Insert     | 11 s    | 3 s       | 60 s      | 15 s        |
| Update     | 8 s     | 4 s       | 84 s      | 27 s        |
| Delete     | 50 s    | 3 s       | 5340 s    | 15 s        |

The TestTable has 6 columns (Guid, string x2, int, decimal?, DateTime); all were inserted and 2 were updated.
The test was done locally on this configuration: INTEL i7-10510U CPU 2.30GHz, DDR3 16 GB, SSD SAMSUNG 512 GB.
For small data sets there is an overhead, since most Bulk ops need to create a temp table and also drop it after finishing.
A good rule of thumb is to use Bulk ops for sets greater than 1000 rows.

Bulk info

If Windows Authentication is used, then the ConnectionString should contain Trusted_Connection=True; because SQL credentials are required to stay in the connection.
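
For example, a hedged registration sketch (the server and database names are placeholders):

// hypothetical OnConfiguring registration; 'localhost' and 'MyDb' are placeholders
optionsBuilder.UseSqlServer("Server=localhost;Database=MyDb;Trusted_Connection=True;");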

When used directly, each of these operations is a separate transaction and is automatically committed.
If we need multiple operations in a single procedure, then an explicit transaction should be used, for example:

using (var transaction = context.Database.BeginTransaction())
{
    context.BulkInsert(entities1List);
    context.BulkInsert(entities2List);
    transaction.Commit();
}

The BulkInsertOrUpdate method can be used when both operations are needed within a single connection to the database.
It performs an Update when the PK (PrimaryKey) is matched, otherwise it does an Insert.

BulkInsertOrUpdateOrDelete effectively synchronizes table rows with the input data.
Rows in the Db that are not found in the list will be deleted.
Partial sync can be done on a table subset using an expression set on the config with the method:
bulkConfig.SetSynchronizeFilter<Item>(a => a.Quantity > 0);
Not supported for SQLite (Lite has only the UPSERT statement) nor currently for PostgreSQL. A way to achieve sync functionality there is to Select or BulkRead existing data from the DB, split the list into sublists, and call BulkInsertOrUpdate and BulkDelete separately.
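
A partial-sync sketch using the SetSynchronizeFilter call shown above (the Item entity and the Quantity filter are illustrative):

var bulkConfig = new BulkConfig();
bulkConfig.SetSynchronizeFilter<Item>(a => a.Quantity > 0); // only rows matching the filter are candidates for deletion
context.BulkInsertOrUpdateOrDelete(entities, bulkConfig);   // rows with Quantity > 0 not present in 'entities' are deleted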

BulkRead (SELECT and JOIN done in SQL)
Used when we need to select from a big list based on the unique properties/columns specified in the config's UpdateByProperties:

// instead of WHERE IN, which will time out for lists of roughly 40K+ records
var entities = context.Items.Where(a => itemsNames.Contains(a.Name)).AsNoTracking().ToList(); // SQL IN
// or a JOIN in memory, which loads the entire table
var entities2 = context.Items.Join(itemsNames, a => a.Name, p => p, (a, p) => a).AsNoTracking().ToList();

// USE instead:
var items = itemsNames.Select(a => new Item { Name = a }).ToList(); // Items list with only Name set
var bulkConfig = new BulkConfig { UpdateByProperties = new List<string> { nameof(Item.Name) } };
context.BulkRead(items, bulkConfig); // Items list will be loaded from Db with data (other properties)

A useful config is ReplaceReadEntities, which works as Contains/IN and returns all rows that match the criteria (not necessarily unique).
An example of a special use case is when child entities need to be BulkRead after BulkReading the parent list.
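
A sketch of BulkRead with ReplaceReadEntities (entity and property names are illustrative):

var itemsToRead = itemsNames.Select(n => new Item { Name = n }).ToList();
var bulkConfig = new BulkConfig
{
    UpdateByProperties = new List<string> { nameof(Item.Name) },
    ReplaceReadEntities = true // list is repopulated with all matching rows, Contains/IN style
};
context.BulkRead(itemsToRead, bulkConfig);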

BulkSaveChanges uses the Change Tracker to find all modified (CUD) entities and calls the proper Bulk operation for each table.
Because it needs tracking it is slower than pure Bulk ops, but still much faster than regular SaveChanges.
With the config OnSaveChangesSetFK, setting FKs can be controlled depending on whether PKs are generated in the Db or in memory.
Support for this method was added in version 6 of the library.
Before calling this method, newly created entities should be added to the context with AddRange:

context.Items.AddRange(newEntities); // if newEntities is parent list it can have child sublists
context.BulkSaveChanges();

A practical general usage could be to override the regular SaveChanges and, if the number of modified entity entries is greater than, say, 1000, redirect to the Bulk version.
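
A minimal sketch of such an override (the 1000 threshold and the context name are illustrative; the returned value is an approximation of the number of written entries):

using System.Linq;
using EFCore.BulkExtensions;
using Microsoft.EntityFrameworkCore;

public class MyDbContext : DbContext // hypothetical context
{
    public override int SaveChanges()
    {
        // count tracked entries that would be written (Added/Modified/Deleted)
        int changedCount = ChangeTracker.Entries()
            .Count(e => e.State == EntityState.Added
                     || e.State == EntityState.Modified
                     || e.State == EntityState.Deleted);

        if (changedCount > 1000) // illustrative threshold
        {
            this.BulkSaveChanges(); // redirect large change sets to the Bulk version
            return changedCount;
        }

        return base.SaveChanges();
    }
}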

Note: Bulk ops have an optional argument Type type that can be set to the type of the Entity if the list contains dynamic runtime objects or objects inherited from the Entity class.

BulkConfig arguments

Bulk methods can have an optional argument BulkConfig with properties (bool, int, object, List<string>):

PROPERTY : DEFAULT value
----------------------------------------------------------------------------------------------
PreserveInsertOrder: true,                    PropertiesToInclude: null,
SetOutputIdentity: false,                     PropertiesToIncludeOnCompare: null,
BatchSize: 2000,                              PropertiesToIncludeOnUpdate: null,
NotifyAfter: null,                            PropertiesToExclude: null,
BulkCopyTimeout: null,                        PropertiesToExcludeOnCompare: null,
EnableStreaming: false,                       PropertiesToExcludeOnUpdate: null,
UseTempDB: false,                             UpdateByProperties: null,
UniqueTableNameTempDb: true,                  EnableShadowProperties: false,
CustomDestinationTableName: null,             IncludeGraph: false,
CustomSourceTableName: null,                  OmitClauseExistsExcept: false,
CustomSourceDestinationMappingColumns: null,  DoNotUpdateIfTimeStampChanged: false,
TrackingEntities: false,                      SRID: 4326,
WithHoldlock: true,                           DateTime2PrecisionForceRound: false,
CalculateStats: false,                        TemporalColumns: { "PeriodStart", "PeriodEnd" },
SqlBulkCopyOptions: Default,                  OnSaveChangesSetFK: true,
SqlBulkCopyColumnOrderHints: null,            IgnoreGlobalQueryFilters: false,
OnConflictUpdateWhereSql: null,               ReplaceReadEntities: false,
----------------------------------------------------------------------------------------------
METHOD: SetSynchronizeFilter<T>
        SetSynchronizeSoftDelete<T>

If we want to change the defaults, BulkConfig should be added explicitly with one or more bool properties set to true, and/or int props like BatchSize set to a different number.
The config also has a DelegateFunc for setting the underlying connection/transaction, e.g. in UnderlyingTest.
When doing an update we can choose to exclude one or more properties by adding their names to PropertiesToExclude, or if we need to update fewer than half of the columns, PropertiesToInclude can be used. Setting both lists is not allowed.
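
For example, a partial-column update sketch (the Item properties are illustrative):

// update only the 'Quantity' column of the given entities
context.BulkUpdate(entities, new BulkConfig { PropertiesToInclude = new List<string> { nameof(Item.Quantity) } });

// or update everything except 'Description'
context.BulkUpdate(entities, new BulkConfig { PropertiesToExclude = new List<string> { nameof(Item.Description) } });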

When using the BulkInsertOrUpdate methods, you may also specify the PropertiesToIncludeOnCompare and PropertiesToExcludeOnCompare properties (only for SqlServer). Adding a column name to PropertiesToExcludeOnCompare allows it to be inserted and updated, but the row will not be updated if none of the other columns in that row changed. For example, if you are importing bulk data and want to remove an internal CreateDate or UpdateDate from the comparison, add those columns to PropertiesToExcludeOnCompare.
Another option that may be used in the same scenario is the PropertiesToIncludeOnUpdate and PropertiesToExcludeOnUpdate properties. These properties allow you to specify insert-only columns such as CreateDate and CreatedBy.

If we want to insert only new rows and skip existing ones in the Db (insert-if-not-exists), then use BulkInsertOrUpdate with the config PropertiesToIncludeOnUpdate = new List<string> { "" }.
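
A sketch of this insert-if-not-exists pattern:

// inserts rows that are not yet in the Db; matched rows are left unchanged
var bulkConfig = new BulkConfig { PropertiesToIncludeOnUpdate = new List<string> { "" } };
context.BulkInsertOrUpdate(entities, bulkConfig);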

Additionally there is UpdateByProperties for specifying custom properties by which the update should be matched.
When setting multiple props in UpdateByProperties, the match is done on the columns combined, like a unique constraint based on those columns.
Using UpdateByProperties while also having an Identity column requires that the Id property be excluded.
Also, with PostgreSQL, matching requires a unique index, so for custom UpdateByProperties that do not have one, it is temporarily created, in which case the method can not be in a transaction (throws: current transaction is aborted; CREATE INDEX CONCURRENTLY cannot run inside a transaction block).
Something similar is done with MySQL by temporarily adding a UNIQUE CONSTRAINT.
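
A sketch of an upsert matched by a custom column instead of the PK (Item.Name and the Identity Id are illustrative):

var bulkConfig = new BulkConfig
{
    UpdateByProperties = new List<string> { nameof(Item.Name) },   // match rows by Name instead of the PK
    PropertiesToExclude = new List<string> { nameof(Item.ItemId) } // exclude the Identity Id, as noted above
};
context.BulkInsertOrUpdate(entities, bulkConfig);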

If NotifyAfter is not set it will have the same value as BatchSize, while BulkCopyTimeout when not set has the SqlBulkCopy default of 30 seconds (if set to 0 it indicates no limit).

SetOutputIdentity has a purpose only when the PK has Identity (usually an int type with AutoIncrement); if the PK is a Guid (sequential) created in the application there is no need for it.
Also, tables with composite keys have no Identity column, so there is no such functionality in that case either.

var bulkConfig = new BulkConfig { SetOutputIdentity = true, BatchSize = 4000 };
context.BulkInsert(entities, bulkConfig);
context.BulkInsertOrUpdate(entities, new BulkConfig { SetOutputIdentity = true });
context.BulkInsertOrUpdate(entities, b => b.SetOutputIdentity = true); // e.g. BulkConfig with Action arg.

PreserveInsertOrder is true by default and makes sure that entities are inserted into the Db in the same order as in the entities list.
When the table has an Identity column (int autoincrement) and the values in the list are 0, they will temporarily be changed automatically from 0s into the range -N:-1.
Or they can be set manually with proper values for ordering (negative values are used to avoid conflicts with existing ones in the Db).
Here a single Id value itself doesn't matter, the db will change it to the next in sequence; what matters is their mutual relationship for sorting.
Insertion order is implemented with TOP in conjunction with ORDER BY: stackoverflow:merge-into-insertion-order.
This config should remain true when SetOutputIdentity is set to true on an Entity containing a NotMapped property. issues/76
When using SetOutputIdentity, Id values will be updated to the new ones from the database.
With BulkInsertOrUpdate on SQLServer, rows to be updated have to match on the Id column, or on other unique column(s) if using UpdateByProperties, in which case the orderBy is done with those props instead of the Id, due to how SQL MERGE works. To preserve insert order by Id in this case, an alternative would be to first use BulkRead to find which records already exist, then split the list into two lists, entitiesForUpdate and entitiesForInsert, without configuring UpdateByProperties.
Also, for SQLite the combination of BulkInsertOrUpdate and automatic Identity Id setting will not work properly, since it does not have full MERGE capabilities like SqlServer. Instead, the list can be split into two lists, and BulkInsert and BulkUpdate called separately.
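
A sketch of the split approach mentioned above (Item.Name as the unique lookup column is illustrative, and it is assumed that unmatched items keep ItemId == 0 after BulkRead):

// read existing Ids by the unique column
var lookup = entities.Select(e => new Item { Name = e.Name }).ToList();
context.BulkRead(lookup, new BulkConfig { UpdateByProperties = new List<string> { nameof(Item.Name) } });
var idsByName = lookup.Where(l => l.ItemId != 0).ToDictionary(l => l.Name, l => l.ItemId);

// assign found Ids, then split into the two lists
foreach (var e in entities)
    e.ItemId = idsByName.TryGetValue(e.Name, out var id) ? id : 0;
var entitiesForUpdate = entities.Where(e => e.ItemId != 0).ToList();
var entitiesForInsert = entities.Where(e => e.ItemId == 0).ToList();

context.BulkUpdate(entitiesForUpdate); // matched by PK, which is now set
context.BulkInsert(entitiesForInsert); // inserted in list order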

SetOutputIdentity is useful when BulkInsert is done to multiple related tables that have an Identity column.
After the Insert into the first table, we need the Ids (if using Option 1) that were generated in the Db, because they are the FK (ForeignKey) in the second table.
It is implemented with OUTPUT as part of the MERGE query, so in this case even the Insert is not done directly to the TargetTable but to a TempTable, which is then merged with the TargetTable.
When used, Ids will be updated in the entities list, and if PreserveInsertOrder is set to false then the entities list will be cleared and reloaded.
Example of SetOutputIdentity with parent-child FK related tables:

int numberOfEntities = 1000;
var entities = new List<Item>();
var subEntities = new List<ItemHistory>();
for (int i = 1; i <= numberOfEntities; i++)
{
    var entity = new Item { Name = $"Name {i}" };
    entity.ItemHistories = new List<ItemHistory>()
    {
        new ItemHistory { Remark = $"Info {i}.1" },
        new ItemHistory { Remark = $"Info {i}.2" }
    };
    entities.Add(entity);
}

// Option 1 (recommended)
using (var transaction = context.Database.BeginTransaction())
{
    context.BulkInsert(entities, new BulkConfig { SetOutputIdentity = true });
    foreach (var entity in entities) {
        foreach (var subEntity in entity.ItemHistories) {
            subEntity.ItemId = entity.ItemId; // sets FK to match its linked PK that was generated in DB
        }
        subEntities.AddRange(entity.ItemHistories);
    }
    context.BulkInsert(subEntities);
    transaction.Commit();
}

// Option 2 using Graph (only for SQL Server)
// - all entities in a relationship with the main ones in the list are BulkInsertOrUpdated
context.BulkInsert(entities, b => b.IncludeGraph = true);
  
// Option 3 with BulkSaveChanges() - uses the ChangeTracker, so a little slower than direct Bulk
context.Items.AddRange(entities);
context.BulkSaveChanges();

CalculateStats: when set to true, the result is returned in BulkConfig.StatsInfo (StatsNumber-Inserted/Updated/Deleted). If used for a pure Insert (with batching), then SetOutputIdentity should also be configured because MERGE is required.
TrackingEntities: can be set to true if we want tracking of entities from BulkRead or when SetOutputIdentity is set.
WithHoldlock: means the Serializable isolation level that locks the table (can have a negative effect on concurrency). Setting it to false can optionally be used to solve a deadlock issue on Insert.
UseTempDB: when set, the BulkOperation has to be inside a transaction.
UniqueTableNameTempDb: when changed to false, the temp table name will be only 'Temp' without random numbers.
CustomDestinationTableName: can be set with 'TableName' only or with 'Schema.TableName'.
CustomSourceTableName: when set, enables source data from a specified table already in the Db, so the input list is not used and can be empty.
CustomSourceDestinationMappingColumns: a dict that can be set only if CustomSourceTableName is configured; it is used for specifying Source-Destination column names when they are not the same. Example in test DestinationAndSourceTableNameTest.
EnableShadowProperties: adds (normal) Shadow Properties and persists their values. Disables the automatic discriminator; use the manual method.
IncludeGraph: when set, all entities that have relations with the main ones from the list are also merged into their tables.
OmitClauseExistsExcept: removes the clause from the MERGE statement; required when having non-comparable types like XML, and useful when triggers need to fire even for identical data. Also, in some SQL collations lowercase and capital letters are considered the same (case-insensitive), so for BulkUpdate set it to false.
DoNotUpdateIfTimeStampChanged: if set, checks TimeStamp for concurrency; rows with a conflict will not be updated.
SRID: Spatial Reference Identifier - for SQL Server with NetTopologySuite.
DateTime2PrecisionForceRound: if the dbtype datetime2 has precision less than the default 7, for example 'datetime2(3)', SqlBulkCopy does Floor instead of Round, so when this property is set the rounding will be done in memory to make sure inserted values are the same as with regular SaveChanges.
TemporalColumns: shadow columns used for Temporal tables. The default elements 'PeriodStart' and 'PeriodEnd' can be changed if those columns have custom names.
OnSaveChangesSetFK: used only for BulkSaveChanges. When multiple entries have an FK relationship which is Db generated, this sets the proper value after reading the parent PK from the Db. If PKs are generated in memory, like some Guids, then this can be set to false for better efficiency.
ReplaceReadEntities: when set to true, the result of a BulkRead operation will be provided using replace instead of update. The entities list parameter of the BulkRead method will be repopulated with the obtained data. Enables the functionality of Contains/IN, which returns all entities matching the criteria (does not have to be by unique columns).
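
A sketch of reading the stats (the StatsInfo property names are inferred from the description above, so treat them as an assumption):

var bulkConfig = new BulkConfig { CalculateStats = true, SetOutputIdentity = true };
context.BulkInsertOrUpdate(entities, bulkConfig);
var stats = bulkConfig.StatsInfo; // assumed names: StatsNumberInserted / StatsNumberUpdated / StatsNumberDeleted
Console.WriteLine($"Inserted: {stats.StatsNumberInserted}, Updated: {stats.StatsNumberUpdated}, Deleted: {stats.StatsNumberDeleted}");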

SqlBulkCopyOptions is an Enum (only for SqlServer) with the [Flags] attribute, which enables specifying one or more options:
Default, KeepIdentity, CheckConstraints, TableLock, KeepNulls, FireTriggers, UseInternalTransaction
If you need to keep Identity PK values set in memory, and not let the DB do the autoincrement, then use KeepIdentity:
var bulkConfig = new BulkConfig { SqlBulkCopyOptions = SqlBulkCopyOptions.KeepIdentity };
Useful, for example, when copying from one Db to another.

OnConflictUpdateWhereSql<T>: defines conditional updates on merges; receives (existingTable, insertedTable).
Example: bc.OnConflictUpdateWhereSql = (existing, inserted) => $"{inserted}.TimeUpdated > {existing}.TimeUpdated";
SetSynchronizeFilter<T>: a method that receives and sets an expression filter on entities to delete when using BulkInsertOrUpdateOrDelete. Entities that are filtered out will be ignored and not deleted.
SetSynchronizeSoftDelete<T>: a method that receives and sets an expression on entities to update a property instead of deleting when using BulkInsertOrUpdateOrDelete:
bulkConfig.SetSynchronizeSoftDelete<SomeObject>(a => new SomeObject { IsDeleted = true });

The last optional argument is an Action progress (example in EfOperationTest.cs RunInsert() with WriteProgress()).

context.BulkInsert(entitiesList, null, (a) => WriteProgress(a));
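
A minimal progress-handler sketch (the WriteProgress helper is hypothetical, and the callback value is assumed to be a decimal fraction between 0 and 1):

static void WriteProgress(decimal progress)
{
    // hypothetical helper: print the reported progress as a percentage
    Console.WriteLine($"bulk progress: {progress:P0}");
}

context.BulkInsert(entitiesList, new BulkConfig { NotifyAfter = 1000 }, a => WriteProgress(a));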

The library supports Global Query Filters and Value Conversions as well.
Additionally, BatchUpdate and named Property work with EnumToString Conversion.
It can map OwnedTypes; there are also links with info on how to achieve NestedOwnedTypes and OwnedInSeparateTable.
On PG, when an Enum is in an OwnedType it needs to have a Converter explicitly configured in OnModelCreating.

Table splitting is somewhat specific but can be configured as in Set TableSplit.
Computed and Timestamp columns are automatically excluded from Insert, and when combined with SetOutputIdentity they will be selected back.
Spatial types, like Geometry, are also supported, and if an Entity has one, the EXISTS ... EXCEPT clause is skipped because it is not comparable.
Performance for bulk ops is measured with ActivitySources named 'BulkExecute' (tags: 'operationType', 'entitiesCount').
Bulk Extension methods can be overridden if required, for example to set AuditInfo.
If having problems with deadlocks, there is useful info in issue/46.
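
For example, a minimal listener sketch using System.Diagnostics (only the 'BulkExecute' source name and the tag names come from the text above):

using System;
using System.Diagnostics;

ActivitySource.AddActivityListener(new ActivityListener
{
    ShouldListenTo = source => source.Name == "BulkExecute",
    Sample = (ref ActivityCreationOptions<ActivityContext> _) => ActivitySamplingResult.AllData,
    ActivityStopped = activity => Console.WriteLine(
        $"{activity.DisplayName}: {activity.Duration.TotalMilliseconds} ms " +
        $"(operationType={activity.GetTagItem("operationType")}, entitiesCount={activity.GetTagItem("entitiesCount")})")
});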

The TPH (Table-Per-Hierarchy) inheritance model can be set in 2 ways.
The first is automatically by convention, in which case the Discriminator column is not directly in the Entity but is a Shadow Property.
The second is to explicitly define a Discriminator property in the Entity and configure it with .HasDiscriminator().
An important remark regarding the first case: since we can not set the Discriminator to a certain value directly, we first need to add the list of entities to the DbSet, where it will be set, and after that we can call the Bulk operation. Note that SaveChanges is not called, and we could optionally turn off TrackingChanges for performance. Example:

public class Student : Person { ... }
context.Students.AddRange(entities); // adding to Context so that Shadow property 'Discriminator' gets set
context.BulkInsert(entities);

TPT (Table-Per-Type) is supported as well.

Product compatible and additional computed target framework versions:
.NET net6.0 is compatible; net7.0, net8.0, and their platform-specific variants (android, ios, maccatalyst, macos, tvos, windows, browser) are computed.

NuGet packages (143)

Showing the top 5 NuGet packages that depend on EFCore.BulkExtensions:

Elsa.Persistence.EntityFramework.Core

Elsa is a set of workflow libraries and tools that enable lean and mean workflowing capabilities in any .NET Core application. This package provides Entity Framework Core entities used by the various Elsa persistence EF Core providers.

CyberEye.Common.Lib

Package containing utility and common functions.

GreatUtilities.Core

Essential tools for agile development.

Ssg.Core

Ssg.Core is the core of a framework for web applications.

Adriva.Extensions.Analytics

Adriva Analytics Server Extensions

GitHub repositories (11)

Showing the top 5 popular GitHub repositories that depend on EFCore.BulkExtensions:

cq-panda/Vue.NetCore
(Now supports sqlsugar) .NetCore, .Net6, Vue2, Vue3, Vite, TypeScript, Element Plus + uniapp front-end/back-end separation with fully automatic code generation; supports mobile (iOS/Android/H5/WeChat mini-programs). http://www.volcore.xyz/
Webreaper/Damselfly
Damselfly is a server-based Photograph Management app. The goal of Damselfly is to index an extremely large collection of images, and allow easy search and retrieval of those images, using metadata such as the IPTC keyword tags, as well as the folder and file names. Damselfly includes support for object/face detection.
dotnetcore/sharding-core
A high-performance, lightweight solution for EF Core table sharding and database sharding with read-write separation; zero dependencies, zero learning cost, zero intrusion into business code.
WolvenKit/WolvenKit
Community Mod editor/creator for REDengine games.
VahidN/EFCoreSecondLevelCacheInterceptor
EF Core Second Level Cache Interceptor

Release Notes

Multiple fixes.