Peering IPsec to a FortiGate appliance in Azure vs. a VPN gateway

Hi Forti family!

I’m in a situation where I have to choose between peering an IPsec tunnel with an Azure VPN gateway or with a FortiGate appliance in Azure. I already know and have positive experience with the VPN gateway; however, I’m attempting to redefine/restructure our Azure network design. For various reasons it would be preferred to do an IPsec peering directly with the FortiGate appliance rather than the VPN gateway. However, my colleague stated that roughly two years ago he attempted IPsec peerings directly with a FortiGate in Azure and ran into odd behaviour and/or issues (he can, however, not remember specifically what the issues were). I do not want to discredit my colleague, but I find it hard to believe that this is not possible and/or functional.

So I turn to you guys, hoping some of you out there have experience with peering IPsec to a FortiGate appliance in Azure. Can you recommend doing this, or do you advocate sticking with the VPN gateway?

EDIT: we intend to run an active/passive setup with both an internal and an external load balancer.

We have this kind of peering setup and have zero issues.

If your budget allows for a FortiGate in Azure, then I suggest going that route. Best part: you can utilize SD-WAN to provide load-balancing/failover for your connection. Sure, the Azure gateway can technically do that too, but nothing beats the reliability of FG to FG.
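To illustrate the SD-WAN angle: with two IPsec tunnel interfaces up to the Azure side, you could group them into an SD-WAN zone with a health check, roughly like the sketch below. All names and the probe address (`to-azure-1`, `to-azure-2`, `10.10.0.4`) are hypothetical placeholders; verify the syntax against your FortiOS version.

```
config system sdwan
    set status enable
    config zone
        edit "azure"
        next
    end
    config members
        edit 1
            set interface "to-azure-1"
            set zone "azure"
        next
        edit 2
            set interface "to-azure-2"
            set zone "azure"
        next
    end
    config health-check
        edit "azure-ping"
            set server "10.10.0.4"
            set members 1 2
        next
    end
end
```

From there, SD-WAN rules (or just the implicit rule) decide how traffic is balanced or failed over between the two tunnels.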

Additionally, we’ve already encountered stability issues with the tunnel for a number of clients when terminating on the Azure gateway. One of our larger clients even had an Azure engineer go onsite because the problem had already been going on for a month.

You know what guarantees stability? Setting up a FortiGate in Azure and peering with it.

I rarely see any issues creating/maintaining an IPsec tunnel to Azure. Since it sounds like you are making these changes to a production environment, be sure to read through the Fortinet docs and Azure documentation before making any changes.

https://docs.fortinet.com/document/fortigate/7.4.1/administration-guide/255100/ipsec-vpn-to-azure-with-virtual-network-gateway
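For reference, the on-prem side of such a tunnel might look roughly like the sketch below. All names, addresses, subnets, and proposals here are hypothetical placeholders, not values from the linked guide; match them to your own environment and FortiOS version.

```
config vpn ipsec phase1-interface
    edit "to-azure"
        set interface "wan1"
        set ike-version 2
        set peertype any
        set proposal aes256-sha256
        set dhgrp 14
        set remote-gw 203.0.113.10
        set psksecret ExamplePreSharedKey
    next
end
config vpn ipsec phase2-interface
    edit "to-azure"
        set phase1name "to-azure"
        set proposal aes256-sha256
        set dhgrp 14
        set src-subnet 192.168.10.0 255.255.255.0
        set dst-subnet 10.10.0.0 255.255.0.0
    next
end
```

You’d still need matching firewall policies and static routes (or BGP) over the tunnel interface, as the guide describes.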

Are you running a single FortiGate appliance in Azure, or potentially an active/passive setup?

Our intention is to run an active/passive setup with an ELB and ILB. Maybe I should have specified that in the post as well.

It’s a very small migration, everything is pretty much built beforehand, and we have plenty of experience with peering with a virtual network gateway. But thanks for the input anyway :slight_smile:

Currently, all of the clients I’ve worked with have one FG in Azure.
I’d recommend looking at this document and checking whether your desired setup matches what’s in the diagram.

PS: you can click the deployment types, which will redirect you to a detailed guide on GitHub.

I am sure there are people who have not had as many issues and will tell you that having active/passive in the cloud has its pros.

Personally, I have only had cons; however, I am surely far from the majority.
I would argue that the complexity of a cluster in a cloud far outweighs the benefit (and costs).

In order to have a true cluster, you need redundancy, meaning the cluster nodes shouldn’t run in the same “area”. That will possibly make things more complex in terms of networking, latency, etc.

And even then you might not be immune to global cloud issues.

In my humble opinion it is not worth it to run a cluster in the cloud; a single box will do. Just make sure you test it and make backups (from the FortiGate as well as from the VM), so you have BOTH options: restoring the VM or, in the worst case, redeploying.

However, I have yet to see a scenario where a cluster really helps you (again, unless you have multi-area redundancy). If the cloud itself acts up, the cluster won’t help you; and if you misconfigure something, the change will likely spread to the other node, so the cluster won’t help you there either.

A “cluster” of two single boxes load-balanced by some cloud load balancer might be the better bet (it has its own challenges for sure, but not the cluster ones).

I’ve already visited this documentation and have an active/passive cluster deployed and ready for the PoC. I just wanted to specify this detail, as it might matter in terms of the IPsec VPN having to be peered via the ELB.

But thanks a lot for the response. I’ll give it a shot and maybe do a few tests to see how the VPN behaves. I’m just on a very short deadline for this project, so I wanted to spare myself the headaches and draw upon others’ experiences :slight_smile:

There could definitely be some points made against having a cluster of FortiGates in the cloud. However, we have already gone back and forth on the benefits and downsides, and we believe the benefits outweigh the costs given the criticality of the environment we administrate. And availability zones have definitely been taken into account as well.
But thanks for the input :slight_smile:

How did you get on with this? I’m facing the same situation and struggling to find any documentation detailing the solution.

Sorry for the late response. We created a network hub subscription and deployed a FortiGate active/passive cluster with an ELB/ILB. In the same subscription we have a virtual network gateway / local network gateway IPsec peering towards on-prem. So all traffic between VNets runs through the firewalls, and the same goes for WAN-facing traffic. All traffic towards on-prem goes via the gateway through an on-prem firewall.

All spoke subscriptions have a VNet peering into the hub, and the respective subnets have a default route pointing traffic at the FortiGate ILB, with “propagate gateway routes” enabled to ensure routing towards on-prem via IPsec. It’s maybe a bit more info than you need or requested, but that is how we decided to go about it. Make sure to do some reading and spend time on it before diving in :slight_smile:
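For anyone following along, the spoke routing piece of a setup like that could be sketched with the Azure CLI roughly as below. All resource names and the ILB front-end IP (`10.20.1.4`) are hypothetical; leaving BGP route propagation enabled on the route table is what keeps the gateway-learned on-prem routes working.

```shell
# Route table with gateway route propagation left enabled
az network route-table create \
  --resource-group rg-spoke-app \
  --name rt-spoke-default \
  --disable-bgp-route-propagation false

# Default route sending spoke traffic to the FortiGate ILB front end
az network route-table route create \
  --resource-group rg-spoke-app \
  --route-table-name rt-spoke-default \
  --name default-via-fgt \
  --address-prefix 0.0.0.0/0 \
  --next-hop-type VirtualAppliance \
  --next-hop-ip-address 10.20.1.4

# Attach the route table to the workload subnet
az network vnet subnet update \
  --resource-group rg-spoke-app \
  --vnet-name vnet-spoke-app \
  --name snet-workload \
  --route-table rt-spoke-default
```

The same route table can be reused across spoke subnets; only the subnet association changes per spoke.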