# Sample flipper scripts

Here are several small sample scripts that demonstrate some of the features of flipper.

## Mapping classes

We can compute some basic properties of a mapping class:

```python
import flipper

S = flipper.load('S_1_2')  # Get an EquippedTriangulation.

word = 'aCBACBacbaccbAaAcAaBBcCcBBcCaBaaaABBabBcaBbCBCbaaa'

h = S.mapping_class(word)
print('Built the mapping class h := \'%s\'.' % word)

print('h has order %s (where 0 == infinite).' % h.order())
print('h is %s.' % h.nielsen_thurston_type())

try:
    print('h leaves L := %s projectively invariant.' % h.invariant_lamination().projective_string())
    print('and dilates it by a factor of %s.' % h.dilatation())
except flipper.AssumptionError:
    print('We cannot find a projectively invariant lamination for h as it is not pseudo-Anosov.')
```



## All words

Flipper can systematically generate all words in a given generating set. This is useful for exhaustively searching for mapping classes with rare properties:

```python
from time import time
import flipper

length = 7
S = flipper.load('S_1_2')  # Get an EquippedTriangulation.

start_time = time()
all_words = list(S.all_words(length))
print('Built %d words in %0.3fs.' % (len(all_words), time() - start_time))

# In parallel:
start_time = time()
all_words2 = list(S.all_words(length, cores=2))
print('Built %d words in %0.3fs.' % (len(all_words2), time() - start_time))

assert len(all_words) == len(set(all_words)) and set(all_words) == set(all_words2)
```
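For intuition, word enumeration over a free generating set can be sketched with `itertools.product`. This is a deliberately naive, hypothetical version: unlike `S.all_words`, it applies none of flipper's reductions (relations, inverse cancellations such as `aA`), so it over-counts.

```python
from itertools import product

def naive_words(generators, length):
    """Yield every string over `generators` of length at most `length`.

    A naive sketch only: no relations or inverse pairs are cancelled,
    so this over-counts compared to flipper's S.all_words.
    """
    for n in range(length + 1):
        for letters in product(generators, repeat=n):
            yield ''.join(letters)

words = list(naive_words('aAbB', 2))
print(len(words))  # 1 empty word + 4 of length one + 16 of length two = 21
```

The count grows geometrically in the word length, which is why the exhaustive searches above benefit from the `cores` option.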



## Invariant laminations

We can see just how good flipper is at finding invariant laminations:

```python
from time import time
import flipper

times = {}
surface = 'S_3_1'
length = 20
num_samples = 100

S = flipper.load(surface)  # Get an EquippedTriangulation.

for index in range(num_samples):
    monodromy = S.random_word(length)
    h = S.mapping_class(monodromy)
    start_time = time()
    try:
        h.invariant_lamination()
        times[monodromy] = time() - start_time
        print('%3d/%d: %s %s, Time: %0.3f' % (index+1, num_samples, surface, monodromy, times[monodromy]))
    except flipper.AssumptionError:
        times[monodromy] = time() - start_time
        print('%3d/%d: %s %s, not pA, Time: %0.3f' % (index+1, num_samples, surface, monodromy, times[monodromy]))

print('Average time: %0.3f' % (sum(times.values()) / num_samples))
print('Slowest: %s, Time: %0.3f' % (max(times, key=lambda w: times[w]).replace('.', ''), max(times.values())))
print('Total time: %0.3f' % sum(times.values()))
```



## Pseudo-Anosov distributions

Since flipper can determine the Nielsen–Thurston type of a mapping class, we can use it to explore how the percentage of pseudo-Anosovs grows with respect to word length:

```python
import flipper

S = flipper.load('S_1_2')  # Get an EquippedTriangulation.
length = 10
num_samples = 100

for i in range(length):
    pA_samples = sum(1 if S.mapping_class(S.random_word(i)).is_pseudo_anosov() else 0 for _ in range(num_samples))
    print('Length %d: %0.1f%% pA' % (i, float(pA_samples) * 100 / num_samples))
```
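The sampling harness itself is independent of flipper. The same estimate-a-proportion skeleton, with a toy predicate (whether a random integer is even) standing in for `is_pseudo_anosov`, looks like this:

```python
import random

def estimate_percentage(predicate, sample, num_samples):
    """Estimate what percentage of `num_samples` draws satisfy `predicate`."""
    hits = sum(1 if predicate(sample()) else 0 for _ in range(num_samples))
    return 100.0 * hits / num_samples

rng = random.Random(0)  # Fixed seed so the run is reproducible.
pct = estimate_percentage(lambda n: n % 2 == 0, lambda: rng.randrange(1000), 100)
print('%0.1f%% even' % pct)
```

With `num_samples = 100` the estimate carries a standard error of a few percentage points, which is worth bearing in mind when reading the pA percentages above.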



## Conjugacy classes

Flipper can partition pseudo-Anosov mapping classes into conjugacy classes:

```python
import flipper

S = flipper.load('S_1_1')  # Get an EquippedTriangulation.
length = 6

buckets = []  # All the different conjugacy classes that we have found.
# We could order the buckets by something, say dilatation.
for index, word in enumerate(S.all_words(length)):
    h = S.mapping_class(word)
    # Currently, we can only determine conjugacy classes for
    # pseudo-Anosovs, so we had better filter by them.
    if h.is_pseudo_anosov():
        # Check if this is conjugate to a mapping class we have seen.
        for bucket in buckets:
            # Conjugacy is transitive, so we only have to bother checking
            # if h is conjugate to the first entry in the bucket.
            if bucket[0].is_conjugate_to(h):
                bucket.append(h)
                break
        else:  # We have found a new conjugacy class.
            buckets.append([h])
    print('%d words in %d conjugacy classes.' % (index + 1, len(buckets)))

print(buckets)
```
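The `for`/`else` bucketing pattern above is easy to misread: the `else` clause runs only when the inner loop finishes without hitting `break`. Here is the same pattern as a self-contained sketch, partitioning integers by residue mod 3 in place of conjugacy:

```python
def partition(items, equivalent):
    """Group items into buckets of pairwise-equivalent elements.

    Mirrors the conjugacy-class loop: compare each item against one
    representative per bucket; the for/else adds a new bucket only
    when no existing representative matched (i.e. no `break` fired).
    """
    buckets = []
    for item in items:
        for bucket in buckets:
            if equivalent(bucket[0], item):
                bucket.append(item)
                break
        else:  # No bucket matched: start a new class.
            buckets.append([item])
    return buckets

classes = partition(range(10), lambda x, y: x % 3 == y % 3)
print(classes)  # [[0, 3, 6, 9], [1, 4, 7], [2, 5, 8]]
```

Comparing only against `bucket[0]` is valid exactly because the relation is transitive, which is the same shortcut the conjugacy search relies on.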



## Bundles

Flipper can interface with SnapPy to build the mapping torus associated to a mapping class. When the mapping class is pseudo-Anosov, flipper builds Agol's veering triangulation of the fully punctured mapping torus and installs the correct Dehn fillings:

```python
import snappy
import flipper

S = flipper.load('S_1_1')
h = S.mapping_class('aB')  # A pseudo-Anosov mapping class.

# Build Agol's veering triangulation of the bundle.
# This will fail with an AssumptionError if h is not pseudo-Anosov.
bundle = h.bundle()

print('The bundle has %d cusp(s) with the following properties:' % bundle.triangulation3.num_cusps)
for index, (real, fibre, degeneracy) in enumerate(zip(bundle.cusp_types(), bundle.fibre_slopes(), bundle.degeneracy_slopes())):
    print('\tCusp %s (%s): Fibre slope %s, degeneracy slope %s' % (index, 'Real' if real else 'Fake', fibre, degeneracy))

# Fake cusps filled.
M = snappy.Manifold(bundle)
print(M.identify())

# Can also build a non-veering triangulation of the bundle.
# This works for all mapping classes.
M2 = snappy.Manifold(h.bundle(veering=False))
print(M2.identify())

# If we don't fill the fake cusps we may get a different manifold.
N = snappy.Manifold(bundle.snappy_string(filled=False))
print(N.identify())
```



## Twister

We can check that the mapping tori built by Twister and flipper agree:

```python
import snappy
import flipper

def match(surface, monodromy):
    M = snappy.twister.Surface(surface).bundle(monodromy)
    N = snappy.Manifold(flipper.load(surface).mapping_class(monodromy).bundle())
    return M.is_isometric_to(N)

assert match('S_1_1', 'aB')
assert match('S_1_2', 'abC')
assert match('S_2_1', 'abbbCddEdaa')
```


## Censuses

Flipper includes large censuses of monodromies for fibred knots and manifolds:

```python
from time import time
import snappy
import flipper

for _, row in flipper.census('CHW').iterrows():
    start_time = time()
    M = snappy.Manifold(row.manifold)
    N = snappy.Manifold(flipper.load(row.surface).mapping_class(row.monodromy).bundle())
    assert M.is_isometric_to(N)  # Never fails for these examples.
    print('Matched %s over %s with %s in %0.3fs.' % (row.monodromy, row.surface, row.manifold, time() - start_time))
```



## Knot cusp orders

Flipper can find fibred knots where the stable lamination has two (6_2) or even one (8_20) prong coming out of the knot:

```python
import flipper

for _, row in flipper.census('knots').iterrows():
    stratum = flipper.load(row.surface).mapping_class(row.monodromy).stratum()
    vertex_orders = [stratum[singularity] for singularity in stratum]
    real_vertex_orders = [stratum[singularity] for singularity in stratum if not singularity.filled]
    print('%s (%s over %s) has singularities %s with %s real singularities.' % (row.manifold, row.monodromy, row.surface, vertex_orders, real_vertex_orders))
```



## Hard invariant laminations

There is also a database of mapping classes that flipper has previously had a hard time finding invariant laminations for. These may be useful test cases for other pieces of software or be worth exploring for interesting mathematical properties:

```python
from time import time
import flipper

times = {}

examples = flipper.census('hard')

for index, row in examples.iterrows():
    h = flipper.load(row.surface).mapping_class(row.monodromy)
    start_time = time()
    try:
        h.invariant_lamination()
        times[row.monodromy] = time() - start_time
        print('%3d/%d: %s %s, Time: %0.3f' % (index+1, len(examples), row.surface, row.monodromy, times[row.monodromy]))
    except flipper.AssumptionError:
        times[row.monodromy] = time() - start_time
        print('%3d/%d: %s %s, not pA, Time: %0.3f' % (index+1, len(examples), row.surface, row.monodromy, times[row.monodromy]))

print('Average time: %0.3f' % (sum(times.values()) / len(examples)))
print('Slowest: %s, Time: %0.3f' % (max(times, key=lambda w: times[w]), max(times.values())))
print('Total time: %0.3f' % sum(times.values()))
```