Tags: Oracle · .NET · Database Performance · Enterprise

Oracle Database Performance: Connection Pooling and Multi-User Strategies in .NET

Umut Korkmaz · 2025-05-20 · 9 min read

Oracle Database in a .NET stack isn't the trendiest combination in 2025, but in enterprise banking, it's the reality. Our system serves thousands of concurrent users across multiple roles — customers, tellers, branch managers, admins — each with different data access patterns and security requirements. Here's how I architected our database layer for performance and security using role-based connection strings, Dapper for critical paths, and connection pooling strategies that actually work.

The Multi-User Connection Strategy

Most tutorials show you one connection string in appsettings.json. We have four — one per role tier:

json
{
  "ConnectionStrings": {
    "OracleCustomer": "User Id=APP_CUSTOMER;Password=***;Data Source=BANKDB;Min Pool Size=20;Max Pool Size=100;Connection Timeout=15;",
    "OracleTeller": "User Id=APP_TELLER;Password=***;Data Source=BANKDB;Min Pool Size=10;Max Pool Size=50;Connection Timeout=15;",
    "OracleManager": "User Id=APP_MANAGER;Password=***;Data Source=BANKDB;Min Pool Size=5;Max Pool Size=20;Connection Timeout=15;",
    "OracleAdmin": "User Id=APP_ADMIN;Password=***;Data Source=BANKDB;Min Pool Size=2;Max Pool Size=10;Connection Timeout=15;"
  }
}

Each Oracle user has different grants. APP_CUSTOMER can only SELECT from views and EXECUTE specific packages. APP_TELLER gets INSERT/UPDATE on transaction tables. This is defense in depth — even if our application layer is compromised, the database user's privileges limit the blast radius.
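A minimal sketch of what those grants might look like (the owning schema and object names here are hypothetical, not our actual schema):

```sql
-- Illustrative only: BANK_OWNER and the object names are placeholders.
-- Customer tier: read-only views plus specific packages.
GRANT SELECT ON BANK_OWNER.V_ACCOUNT_SUMMARY TO APP_CUSTOMER;
GRANT EXECUTE ON BANK_OWNER.PKG_CUSTOMER_OPS TO APP_CUSTOMER;

-- Teller tier: can write to transaction tables.
GRANT SELECT, INSERT, UPDATE ON BANK_OWNER.TRANSACTIONS TO APP_TELLER;
```

The point is the asymmetry: APP_CUSTOMER never receives INSERT or UPDATE on base tables at all, so no application bug can grant it write access.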

The Connection Factory

csharp
public interface IDbConnectionFactory
{
    IDbConnection CreateConnection(UserRole role);
}

public class OracleConnectionFactory : IDbConnectionFactory
{
    private readonly IConfiguration _configuration;
    private readonly ILogger<OracleConnectionFactory> _logger;

    public OracleConnectionFactory(
        IConfiguration configuration,
        ILogger<OracleConnectionFactory> logger)
    {
        _configuration = configuration;
        _logger = logger;
    }

    public IDbConnection CreateConnection(UserRole role)
    {
        var connectionString = role switch
        {
            UserRole.Customer => _configuration.GetConnectionString("OracleCustomer"),
            UserRole.Teller => _configuration.GetConnectionString("OracleTeller"),
            UserRole.Manager => _configuration.GetConnectionString("OracleManager"),
            UserRole.Admin => _configuration.GetConnectionString("OracleAdmin"),
            _ => throw new ArgumentOutOfRangeException(nameof(role))
        };

        _logger.LogDebug("Creating connection for role {Role}", role);

        var connection = new OracleConnection(connectionString);
        return connection;
    }
}

We register this as a singleton in DI since it's stateless — the actual connections come from Oracle's built-in pool per connection string.
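The registration itself is a one-liner; a sketch of what that looks like in Program.cs with minimal hosting:

```csharp
// Program.cs: the factory holds no per-request state, so a singleton is safe.
// Pooling happens inside ODP.NET, keyed by connection string, not here.
builder.Services.AddSingleton<IDbConnectionFactory, OracleConnectionFactory>();
```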

Why Dapper Over Entity Framework for Performance Queries

Entity Framework is fine for CRUD. But for our performance-critical queries — account balance lookups, transaction history with complex filtering, real-time dashboard aggregations — Dapper wins hands down.

csharp
public class TransactionRepository : ITransactionRepository
{
    private readonly IDbConnectionFactory _connectionFactory;

    public TransactionRepository(IDbConnectionFactory connectionFactory)
    {
        _connectionFactory = connectionFactory;
    }

    public async Task<PagedResult<TransactionDto>> GetTransactionsAsync(
        TransactionFilter filter,
        UserRole callerRole)
    {
        using var connection = _connectionFactory.CreateConnection(callerRole);

        var sql = @"
            SELECT t.TRANSACTION_ID as TransactionId,
                   t.AMOUNT,
                   t.TRANSACTION_DATE as TransactionDate,
                   t.DESCRIPTION,
                   a.ACCOUNT_NUMBER as AccountNumber,
                   t.STATUS
            FROM TRANSACTIONS t
            INNER JOIN ACCOUNTS a ON t.ACCOUNT_ID = a.ACCOUNT_ID
            WHERE t.ACCOUNT_ID = :AccountId
              AND t.TRANSACTION_DATE BETWEEN :StartDate AND :EndDate
              AND (:Status IS NULL OR t.STATUS = :Status)
            ORDER BY t.TRANSACTION_DATE DESC
            OFFSET :Offset ROWS FETCH NEXT :PageSize ROWS ONLY";

        var countSql = @"
            SELECT COUNT(*)
            FROM TRANSACTIONS t
            WHERE t.ACCOUNT_ID = :AccountId
              AND t.TRANSACTION_DATE BETWEEN :StartDate AND :EndDate
              AND (:Status IS NULL OR t.STATUS = :Status)";

        var parameters = new
        {
            filter.AccountId,
            filter.StartDate,
            filter.EndDate,
            filter.Status,
            Offset = (filter.Page - 1) * filter.PageSize,
            filter.PageSize
        };

        // Execute both queries on the same connection
        var transactions = await connection.QueryAsync<TransactionDto>(sql, parameters);
        var totalCount = await connection.ExecuteScalarAsync<int>(countSql, parameters);

        return new PagedResult<TransactionDto>
        {
            Items = transactions.ToList(),
            TotalCount = totalCount,
            Page = filter.Page,
            PageSize = filter.PageSize
        };
    }
}

The difference is measurable. For our transaction history query with joins across three tables, the SQL generated by EF Core took ~180ms; the hand-tuned Dapper query took ~45ms. When you're serving thousands of concurrent users, that 4x improvement matters.
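One further trim worth considering (a sketch, not what we run in production): Oracle's COUNT(*) OVER () analytic function can return the pre-paging total alongside each page row, collapsing the two queries above into one round trip. This assumes a hypothetical TransactionRow DTO mirroring TransactionDto plus a TotalCount property:

```csharp
// Sketch: page rows and total count in a single round trip.
// COUNT(*) OVER () is evaluated after WHERE but before OFFSET/FETCH,
// so it reflects the full filtered count, not just the page.
var sql = @"
    SELECT t.TRANSACTION_ID as TransactionId,
           t.AMOUNT,
           t.TRANSACTION_DATE as TransactionDate,
           COUNT(*) OVER () as TotalCount
    FROM TRANSACTIONS t
    WHERE t.ACCOUNT_ID = :AccountId
    ORDER BY t.TRANSACTION_DATE DESC
    OFFSET :Offset ROWS FETCH NEXT :PageSize ROWS ONLY";

var rows = (await connection.QueryAsync<TransactionRow>(sql, parameters)).ToList();
var totalCount = rows.Count > 0 ? rows[0].TotalCount : 0;
```

Benchmark before switching: the window function adds counting work to the main query, so for small pages over very large date ranges the two-query version can still come out ahead.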

Connection Pool Tuning

Oracle's connection pool is per connection string. Here's what I learned tuning it:

Min Pool Size=20;Max Pool Size=100;
Connection Lifetime=300;
Connection Timeout=15;
Incr Pool Size=5;
Decr Pool Size=2;
Validate Connection=true;

Min Pool Size: Set this to your baseline concurrent connections. For the customer tier, analytics showed 20 connections were almost always in use during business hours. Setting this avoids the cold-start penalty of creating connections on demand.

Max Pool Size: This is your ceiling. 100 connections for the customer tier may sound high, but remember — each API request holds a connection for the duration of the query. With 200ms average query time and 500 requests/second, you need 100 connections just to keep up.

Connection Lifetime: Set to 300 seconds (5 minutes). This forces connections to be recycled, which handles Oracle RAC node rebalancing and prevents stale connections after network blips.

Validate Connection: This pings the connection before handing it to your code. It adds ~1ms overhead but prevents the dreaded "ORA-03113: end-of-file on communication channel" errors.

Bulk Operations with Array Binding

For batch processing (end-of-day reconciliation, bulk notification inserts), Oracle's array binding is dramatically faster than individual inserts:

csharp
public async Task BulkInsertNotificationsAsync(
    List<NotificationEntity> notifications,
    UserRole callerRole)
{
    using var connection = (OracleConnection)_connectionFactory
        .CreateConnection(callerRole);
    await connection.OpenAsync();

    using var command = connection.CreateCommand();
    // ODP.NET binds parameters by position by default; bind by name so the
    // parameter order below doesn't silently have to match the statement.
    command.BindByName = true;
    command.CommandText = @"
        INSERT INTO NOTIFICATIONS (NOTIFICATION_ID, USER_ID, MESSAGE, CREATED_AT, STATUS)
        VALUES (:Id, :UserId, :Message, :CreatedAt, :Status)";

    command.ArrayBindCount = notifications.Count;

    command.Parameters.Add(new OracleParameter("Id",
        OracleDbType.Varchar2, notifications.Select(n => n.Id).ToArray(),
        ParameterDirection.Input));
    command.Parameters.Add(new OracleParameter("UserId",
        OracleDbType.Varchar2, notifications.Select(n => n.UserId).ToArray(),
        ParameterDirection.Input));
    command.Parameters.Add(new OracleParameter("Message",
        OracleDbType.Varchar2, notifications.Select(n => n.Message).ToArray(),
        ParameterDirection.Input));
    command.Parameters.Add(new OracleParameter("CreatedAt",
        OracleDbType.Date, notifications.Select(n => n.CreatedAt).ToArray(),
        ParameterDirection.Input));
    command.Parameters.Add(new OracleParameter("Status",
        OracleDbType.Varchar2, notifications.Select(n => n.Status).ToArray(),
        ParameterDirection.Input));

    await command.ExecuteNonQueryAsync();
}

Inserting 10,000 notifications individually: ~45 seconds. With array binding: ~1.2 seconds. That's a 37x improvement.
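For very large batches it can be worth chunking the call so a single array bind doesn't pin enormous parameter arrays in memory. A hypothetical usage sketch (the 5,000-row chunk size is an assumption, not a measured optimum):

```csharp
// Enumerable.Chunk requires .NET 6+.
const int chunkSize = 5000;
foreach (var chunk in notifications.Chunk(chunkSize))
{
    await BulkInsertNotificationsAsync(chunk.ToList(), callerRole);
}
```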

Health Checks and Monitoring

We expose connection pool health via ASP.NET health checks:

csharp
public class OracleHealthCheck : IHealthCheck
{
    private readonly IDbConnectionFactory _factory;

    public OracleHealthCheck(IDbConnectionFactory factory)
    {
        _factory = factory;
    }

    public async Task<HealthCheckResult> CheckHealthAsync(
        HealthCheckContext context,
        CancellationToken cancellationToken)
    {
        var roles = new[] { UserRole.Customer, UserRole.Teller };
        var results = new Dictionary<string, object>();

        foreach (var role in roles)
        {
            try
            {
                using var connection = _factory.CreateConnection(role);
                await ((OracleConnection)connection).OpenAsync(cancellationToken);

                var poolStats = new
                {
                    IsOpen = connection.State == ConnectionState.Open,
                };

                results[role.ToString()] = poolStats;
            }
            catch (Exception ex)
            {
                return HealthCheckResult.Unhealthy(
                    $"Oracle connection failed for role {role}",
                    ex);
            }
        }

        return HealthCheckResult.Healthy("All Oracle connections healthy",
            results);
    }
}
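Wiring the check up is standard ASP.NET Core; a minimal sketch (the endpoint path and check name are our conventions, not framework defaults):

```csharp
// Program.cs
builder.Services.AddHealthChecks()
    .AddCheck<OracleHealthCheck>("oracle", tags: new[] { "db" });

var app = builder.Build();
app.MapHealthChecks("/health/db");
```

Our load balancer polls this endpoint, so a pool exhaustion or listener outage on either tier pulls the instance out of rotation before users see errors.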

Lessons Learned

After two years running this architecture in production, the key takeaways:

- Role-based connection strings are worth the added complexity because they provide real security boundaries.
- Dapper is not an either/or choice with EF Core; we use both where each excels.
- Connection pool tuning requires production metrics rather than guesswork.
- Oracle array binding should be your default for any batch operation over 100 rows.

The Oracle + .NET combination may not win popularity contests, but with the right architecture, it handles enterprise-scale workloads reliably. And in high-compliance environments, reliability trumps novelty.