# Graph Mode
Graph mode was the default in TensorFlow 1.x; TensorFlow 2.x defaults to eager mode instead. Graph mode is computationally faster but much harder to write and debug, so a common workflow is to develop in eager mode and then switch to graph mode to squeeze out extra performance. [[#Autograph]] lets you do that conversion automatically.
## Generating graph mode
Use the decorator `@tf.function` on top of a function to declare it in graph mode.
```python
@tf.function
def add(a, b):
    return a + b
```
You can use `print(tf.autograph.to_code(add.python_function))` to inspect the generated graph code, which looks like this:
```python
def tf__add(a, b):
    with ag__.FunctionScope('add', 'fscope', ag__.ConversionOptions(recursive=True, user_requested=True, optional_features=(), internal_convert_user_code=True)) as fscope:
        do_return = False
        retval_ = ag__.UndefinedReturnValue()
        try:
            do_return = True
            retval_ = (ag__.ld(a) + ag__.ld(b))
        except:
            do_return = False
            raise
        return fscope.ret(retval_, do_return)
```
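A quick sanity check that the traced function behaves like its eager counterpart (a minimal sketch, redefining `add` from above):

```python
import tensorflow as tf

@tf.function
def add(a, b):
    return a + b

# The first call traces the Python function into a graph; later calls
# with the same input signature reuse the cached graph.
x = tf.constant(2.0)
y = tf.constant(3.0)
print(add(x, y))  # tf.Tensor(5.0, shape=(), dtype=float32)
```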
Any function called from within a function annotated with `@tf.function` is also converted to graph mode. Functions that use a lot of small ops benefit the most from this conversion.
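A small sketch of that cascade (function names here are illustrative): only the outer function carries the decorator, yet the plain helper it calls is traced into the same graph.

```python
import tensorflow as tf

def inner(x):
    # Plain Python function: still converted to graph ops
    # when called from inside a tf.function.
    return tf.square(x) + 1.0

@tf.function
def outer(x):
    return inner(x) * 2.0

print(outer(tf.constant(3.0)))  # tf.Tensor(20.0, shape=(), dtype=float32)
```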
1. You can apply `@tf.function` to most functions and get the benefits of autograph.
2. Autograph only works with the `tf.` equivalents of NumPy or plain Python operations, so use them as much as possible.
3. Be careful where you create variables. Variable creation and ops on variables should be kept separate under autograph. For example, the following errors out:
```python
@tf.function
def f(x):
    v = tf.Variable(1.0)  # a new variable on every trace
    v.assign_add(x)
    return v

# Calling f more than once (with inputs that trigger retracing) raises:
# ValueError: tf.function-decorated function tried to create variables on non-first call
```
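One common fix (a sketch, not the only pattern) is to create the variable once, outside the traced function, and only mutate it inside:

```python
import tensorflow as tf

# Create the variable once, outside the traced function.
v = tf.Variable(1.0)

@tf.function
def f(x):
    # Only ops on the existing variable run inside the graph,
    # so repeated calls never re-create it.
    v.assign_add(x)
    return v

f(tf.constant(2.0))  # v is now 3.0
```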
## Autograph