While a proxy is a good way to keep your zero trust environment secure, it often comes with a cost: poor performance for your users. Soon after deploying a client proxy, security teams are often slammed with support tickets from users frustrated by slow browsing, sluggish file transfers, and video calls glitching at just the wrong moment. After a while, you start to chalk it all up to the proxy, potentially blinding yourself to other issues affecting performance.
We knew it didn’t have to be this way. We knew users could go faster, without sacrificing security, if we completely rebuilt our approach to proxy mode. So we did.
In the early days of developing the device client for our SASE platform, Cloudflare One, we prioritized universal compatibility. When an admin enabled proxy mode, the Client acted as a local SOCKS5 or HTTP proxy. However, because our underlying tunnel architecture was built on WireGuard, a Layer 3 (L3) protocol, we faced a technical hurdle: how to get transport-layer (L4) TCP traffic into an L3 tunnel. Moving from L4 to L3 was especially difficult because our desktop Client works across multiple platforms (Windows, macOS, Linux), so we couldn’t rely on each platform’s kernel to do this.
To get over this hurdle, we used smoltcp, a Rust-based user-space TCP implementation. When a packet hit the local proxy, the Client had to perform a conversion, using smoltcp to turn the L4 stream into L3 packets for the WireGuard tunnel.
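The SOCKS5 CONNECT request that arrives at the local proxy is a compact, well-defined L4 message (RFC 1928). As an illustrative sketch only (in Python for brevity; the Client itself is written in Rust, and this is not its actual implementation), parsing one looks like:

```python
import ipaddress
import struct

def parse_socks5_connect(msg: bytes):
    """Parse a SOCKS5 CONNECT request (RFC 1928) into (host, port)."""
    ver, cmd, _rsv, atyp = msg[0], msg[1], msg[2], msg[3]
    if ver != 0x05 or cmd != 0x01:  # version 5, command CONNECT
        raise ValueError("not a SOCKS5 CONNECT request")
    if atyp == 0x01:  # IPv4 address, 4 bytes
        host = str(ipaddress.IPv4Address(msg[4:8]))
        rest = msg[8:]
    elif atyp == 0x03:  # domain name, length-prefixed
        n = msg[4]
        host = msg[5:5 + n].decode("ascii")
        rest = msg[5 + n:]
    elif atyp == 0x04:  # IPv6 address, 16 bytes
        host = str(ipaddress.IPv6Address(msg[4:20]))
        rest = msg[20:]
    else:
        raise ValueError("unknown address type")
    (port,) = struct.unpack("!H", rest[:2])  # network byte order
    return host, port

# Example: CONNECT to example.com:443 by domain name
req = b"\x05\x01\x00\x03" + bytes([11]) + b"example.com" + struct.pack("!H", 443)
print(parse_socks5_connect(req))  # → ('example.com', 443)
```

The proxy receives exactly this kind of L4 stream, and in the old architecture it then had to re-segment everything into L3 packets via smoltcp.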
While this worked, it wasn’t efficient. Smoltcp is optimized for embedded systems and doesn’t support modern TCP features. In addition, at the Cloudflare edge, we had to convert the L3 packets back into an L4 stream. For users, this manifested as a performance ceiling. On media-heavy sites, where a browser might open dozens of concurrent connections for images and video, the lack of a high-performance TCP stack led to high latency and slow load times even on high-speed fiber connections; proxy mode felt significantly slower than all the other device client modes.
Introducing direct L4 proxying with QUIC
To solve this, we rebuilt the Cloudflare One Client’s proxy mode from the ground up and deprecated the use of WireGuard for proxy mode, so we can capitalize on the capabilities of QUIC. We were already leveraging MASQUE (which runs over QUIC) for proxying IP packets, and we added the use of QUIC streams for direct L4 proxying.
By leveraging HTTP/3 (RFC 9114) with the CONNECT method, we can now keep traffic at Layer 4, where it belongs. When your browser sends a SOCKS5 or HTTP request to the Client, it is no longer broken down into L3 packets.
Instead, it is encapsulated directly into a QUIC stream.
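Semantically, the CONNECT request carried on an HTTP/3 stream is the same tunnel-establishment message as classic HTTP/1.1 CONNECT; HTTP/3 simply encodes it as pseudo-header fields (`:method: CONNECT`, `:authority: example.com:443`) on a dedicated QUIC stream. A minimal sketch of the familiar HTTP/1.1 wire form, for illustration only:

```python
def build_connect_request(host: str, port: int) -> bytes:
    # HTTP/1.1 form of the CONNECT tunnel request. HTTP/3 (RFC 9114)
    # expresses the same thing with :method and :authority pseudo-headers,
    # QPACK-encoded, on its own QUIC stream.
    authority = f"{host}:{port}"
    return (
        f"CONNECT {authority} HTTP/1.1\r\n"
        f"Host: {authority}\r\n"
        "\r\n"
    ).encode("ascii")

print(build_connect_request("example.com", 443).decode())
```

Once the server answers with a 2xx response, every byte on that stream is the tunneled L4 payload, with no IP-packet detour in between.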
This architectural shift provides three immediate technical advantages:
Bypassing smoltcp: By removing the L3 translation layer, we eliminate IP packet handling and the limitations of smoltcp’s TCP implementation.
Native QUIC benefits: We gain modern congestion control and flow control, which are handled natively by the transport layer.
Tunability: The Client and Cloudflare’s edge can tune QUIC’s parameters to optimize performance.
In our internal testing, the results were clear: download and upload speeds doubled, and latency decreased significantly.
While faster is always better, this update specifically unblocks three common use cases.
First, in coexistence with third-party VPNs, where a legacy VPN is still required for specific on-prem resources or where a dual SASE setup is required for redundancy or compliance, local proxy mode is the go-to solution for adding zero trust security to web traffic. This update ensures that “layering” security doesn’t mean sacrificing the user experience.
Second, for high-bandwidth application partitioning, proxy mode is often used to steer specific browser traffic through Cloudflare Gateway while leaving the rest of the OS on the local network. Users can now stream high-definition content or handle large datasets without sacrificing performance.
Finally, developers and power users who rely on the SOCKS5 secondary listener for CLI tools or scripts will see immediate improvements. Remote API calls and data transfers through the proxy now benefit from the same low-latency connection as the rest of the Cloudflare global network.
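For a concrete picture of that workflow, a script can point its HTTP(S) traffic at the Client’s local listener. A minimal Python sketch, where 127.0.0.1:40000 is an assumed listener address used purely for illustration (use whatever port your device profile configures):

```python
import urllib.request

# Route this script's HTTP(S) traffic through the Client's local
# HTTP proxy listener. The address below is an assumption for the
# example, not an official default.
proxy = urllib.request.ProxyHandler({
    "http": "http://127.0.0.1:40000",
    "https": "http://127.0.0.1:40000",
})
opener = urllib.request.build_opener(proxy)
# opener.open("https://example.com")  # requests would now traverse the proxy
```

CLI tools that honor the conventional `HTTPS_PROXY`/`ALL_PROXY` environment variables can be pointed at the same listener without any code changes.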
The proxy mode improvements are available with minimum client version 2025.8.779.0 for Windows, macOS, and Linux devices. To take advantage of these performance gains, make sure you are running the latest version of the Cloudflare One Client.
Log in to the Cloudflare One dashboard.
Navigate to Teams & Resources > Devices > Device profiles > Default profiles.
Select a profile to edit, or create a new one, and ensure the Service mode is set to Local proxy mode and the Device tunnel protocol is set to MASQUE.
You can verify the active protocol on a client device by running the following command in your terminal:
warp-cli settings | grep protocol

Go to our documentation for detailed guidance on enabling proxy mode for your devices.
If you haven’t started your SASE journey yet, you can sign up for a free Cloudflare One account for up to 50 users today. Simply create an account, download the Cloudflare One Client, and follow our onboarding guide to experience a faster, more secure connection for your entire organization.



