The Litepresence Report on Cryptocurrency

I don't think we will see moon prices again anytime soon. It's frightening, to say the least. I mean, coins like PPC and NMC are at around 1 dollar, and PPC is about to breach 0.x. If it does, it might just outright explode?

Anyways, they are clearly being dumped.

Any reason you believe that or are you just speculating? Not that there's anything wrong with that, just wondering why you think that.

The altcoins may be dead, but that doesn't mean BTC, the original cryptocurrency, has to follow their lead. I, for one, have been paying zero attention to altcoins. I feel like very few people care about them anymore.
 

I have no idea; I just got this feeling. Things aren't looking good. I bought PPC at $2.50, and I fear I will never see that money again.
 

Sorry to hear that. Sometimes, though, that sinking feeling happens right when the market reaches a bottom because despair has set in and the market is free from the fear that was fueling the drop.

Since you are human, you often experience these feelings at the same time as other people do, which is precisely at the point when most people give up and stop driving the price down by panic-selling.

Be greedy when others are fearful, be fearful when others are greedy. I see a lot of fear right now. This makes me hopeful because, despite all this, the price really hasn't declined that much, and yet people are constantly giving doomsday forecasts.
 

Greedy as in I should buy more? I mean the prices are very low now.
 

I keep buying periodically. Unless you're a really good trader, you're going to look back on this after the next bubble and feel pretty silly for caring about $30 swings.
 
The BitLicense FUD has been somewhat relentless on the inside. If you want to go entirely by charting, something needs to happen soon or it may be a boring sideways period for another year yet like the period from 2011-2012.

Too much stuff in the pipeline for me to believe the latter...but anything is possible...
 
I turned the TV on late this afternoon, and streaming across the FOX News channel was: the Consumer Financial Protection Bureau will now begin accepting complaints about Bitcoin and other similar exchanges, which are not backed by the government. So it appears, in my view, that the Feds are going to go after and shut down bitcoin exchanges, when they probably actually want all the BTC. They may have caused the FUD.
 
# Fetch the trollbox archive whenever external data is updated
def update_external_data():

    storage.trollbox = storage.get('trollbox', [])
    storage.trollbox = get_text('http://trollboxarchive.com/')

def tick():

    # Count keyword mentions on the first tick only
    if info.tick == 0:

        litecoin = storage.trollbox.count('litecoin')
        bitcoin = storage.trollbox.count('bitcoin')
        LTC = storage.trollbox.count('LTC')
        BTC = storage.trollbox.count('BTC')
        moon = storage.trollbox.count('moon')
        crash = storage.trollbox.count('crash')

        log('bitcoin: %s' % bitcoin)
        log('litecoin: %s' % litecoin)
        log('BTC: %s' % BTC)
        log('LTC: %s' % LTC)
        log('moon: %s' % moon)
        log('crash: %s' % crash)

[2014-04-01 13:00:00] bitcoin: 5
[2014-04-01 13:00:00] litecoin: 2
[2014-04-01 13:00:00] BTC: 5
[2014-04-01 13:00:00] LTC: 13
[2014-04-01 13:00:00] moon: 2
[2014-04-01 13:00:00] crash: 3


https://discuss.tradewave.net/t/external-http-requests-in-strategies/317
 
Fourier Analysis
v2.11 with commentary

INSTRUMENT = pairs.btc_usd
FOURIER_LOG = False
FOURIER_PLOT = True
AGGREGATION = 3600
PERIOD = 30
SIFT = 5 # integer from 1 to PERIOD/2

# manifest a definition of FFT with 5 arguments
def fourier_series(array, period, sift, fourier_log, fourier_plot):
    import math
    import numpy as np

    # Create an array of the last `period` closing prices; then reverse its order
    signal = (array[-period:])[::-1]

    # Use numpy to extract amplitude and phase in (a + bi) complex form
    complex_fft = np.fft.fft(signal)

    ''' Calculate amplitude, phase, frequency, and velocity '''
    # define empty lists for later use
    amplitude = []
    phase = []
    frequency = []
    velocity = []

    # extract real and imaginary coefficients from the complex numpy output
    for n in range(period, 0, -1):
        amplitude.append(complex_fft.real[-n])
        phase.append(complex_fft.imag[-n])

    # The final equation will need to be divided by period;
    # I do it here so that it is calculated once, saving cycles
    amplitude = [(x/period) for x in amplitude]

    # Extract the carrier
    carrier = max(amplitude)

    # The frequency comes from numpy's fft helper np.fft.fftfreq;
    # it only needs the length of the data set
    frequency.append(np.fft.fftfreq(signal.size, 1))

    # Unwrap to the frequency array itself
    frequency = frequency[-1]

    # Velocity is just 2*pi*frequency; I do this here once to save cycle time
    velocity = [x*2*math.pi for x in frequency]

    ''' Calculate the Full Spectrum Sinusoid '''
    # Here we recombine ALL elements in the form An*sin(2*pi(Fn) + Pn) for the full spectrum
    full_spectrum = 0
    for m in range(1, period+1):
        full_spectrum += amplitude[-m]*(1+math.sin(velocity[-m] + phase[-m]))

    ''' Calculate the Filtered Sinusoid '''
    # Normalize user sift input as an integer
    sift = int(sift)

    # If sift is at least half of the period, return the full spectrum
    if sift >= period/2:
        filtered_transform = full_spectrum

    # If sift is 0 or 1, return the carrier
    else:
        filtered_transform = carrier

        # For every whole number of sift over 1, but less than half the period:
        # add 2 elements to the sinusoid, one each for
        # a negative and positive frequency pair
        if sift > 1:
            for m in range(1, sift):
                p = period - m
                filtered_transform += amplitude[-m]*(1+math.sin(velocity[-m] + phase[-m]))
                filtered_transform += amplitude[-p]*(1+math.sin(velocity[-p] + phase[-p]))

    # Format array data and log
    if fourier_log:
        log('**********************************')
        log('Carrier: %.3f' % amplitude[-period])
        log(['%.2f' % x for x in amplitude])
        log(['%.2f' % x for x in velocity])
        log(['%.2f' % x for x in phase])

    # Plot the carrier wave and full spectrum sinusoid
    if fourier_plot:
        plot('Carrier', carrier)
        plot('Full_spectrum', full_spectrum)

    # Return the sifted FFT
    return filtered_transform

def tick():

    close = data(interval=AGGREGATION)[INSTRUMENT].warmup_period('close')

    # Call the FFT definition
    y = fourier_series(close, PERIOD, SIFT, FOURIER_LOG, FOURIER_PLOT)

    # Plot the filtered transform
    plot('Filtered_transform', y)

https://discuss.tradewave.net/t/scipy-fourier-analysis/270/20
 
Parabolic SAR v5.0

Hashed Custom Namespace

import time

PAIR = pairs.btc_usd
SAR_PLOT = True # False turns off SAR plot
SAR_AGGREGATION = 3600 # Match to tick = no auto adjust
SAR_SENSITIVITY = 1 # whole, default 2
SAR_RISE_LOOKBACK = 0 # 0 = MAX, else whole # period
SAR_RISE_INITIAL = 0.02 # Initial Rising Acceleration
SAR_RISE_ACCELERATION = 0.02 # Rising Acceleration
SAR_RISE_MAX = 0.2 # Maximum Rising Acceleration
SAR_FALL_LOOKBACK = 0 # 0 = MAX, else whole # period
SAR_FALL_INITIAL = 0.02 # Initial Falling Acceleration
SAR_FALL_ACCELERATION = 0.02 # Falling Acceleration
SAR_FALL_MAX = 0.2 # Maximum Falling Acceleration

def parabolic_sar(pair, sar_plot, aggregation, sensitivity,
                  rise_lookback, rise_initial, rise_acceleration, rise_max,
                  fall_lookback, fall_initial, fall_acceleration, fall_max):

    import time
    import math
    from decimal import Decimal  # used below; imported here in case the platform doesn't provide it

    start_time = time.time()

    ''' Seed pseudo random 24 character hash string '''
    fall_hash = 2**(1/2.0)*fall_max + 3**(1/2.0)*fall_acceleration + 5**(1/2.0)*(fall_initial+1)
    rise_hash = 2**(1/3.0)*rise_max + 3**(1/3.0)*rise_acceleration + 5**(1/3.0)*(rise_initial+1)
    lookback_hash = 2**(1/2.0)*fall_lookback + 3**(1/2.0)*rise_lookback + 5**(1/2.0)*sensitivity
    pair_agg_hash = int(10**25*pair**(1/2.0) / aggregation**(1/2.0))
    sar_hash = str(int(10**25*Decimal((fall_hash+lookback_hash)/rise_hash))+pair_agg_hash)[-24:]

    ''' Initialize Stored Variables '''
    extreme_point = 'parabolic_sar_extreme_point_'+sar_hash
    acceleration = 'parabolic_sar_acceleration_'+sar_hash
    direction = 'parabolic_sar_direction_'+sar_hash
    previous = 'parabolic_sar_previous_'+sar_hash
    storage[extreme_point] = storage.get(extreme_point, 0)
    storage[acceleration] = storage.get(acceleration, 0)
    storage[direction] = storage.get(direction, 0)
    storage[previous] = storage.get(previous, 0)

    ''' Auto Adjust Thresholds Based on Aggregation '''
    power = 1.235 # Bigger reduces SAR crosses
    aggregation_ratio = aggregation/float(info.interval)
    power_ratio = aggregation_ratio**power

    sensitivity = int(math.ceil(sensitivity))
    rise_lookback = int(rise_lookback*aggregation_ratio)
    rise_initial = rise_initial/power_ratio
    rise_acceleration = rise_acceleration/power_ratio
    rise_max = rise_max*aggregation_ratio
    fall_lookback = int(fall_lookback*aggregation_ratio)
    fall_initial = fall_initial/power_ratio
    fall_acceleration = fall_acceleration/power_ratio
    fall_max = fall_max*aggregation_ratio

    ''' Prevent Rattle on 1m ticks '''
    offset = 0
    if info.interval == 60:
        fall_initial = 0
        rise_initial = 0
        offset = 0.002

    ''' Log Adjusted Thresholds and persistent variable names on 1st tick '''
    if 1:
        if info.tick == 0:
            log('storage.' + extreme_point)
            log('storage.' + acceleration)
            log('storage.' + direction)
            log('storage.' + previous)
            log('tick size....: %s' % info.interval)
            log('aggregation..: %s' % aggregation)
            log('agg_ratio....: %s' % aggregation_ratio)
            log('power........: %s' % power)
            log('power_ratio..: %.2f' % power_ratio)
            log('sensitivity..: %s' % sensitivity)
            log('rise_lookback: %s' % rise_lookback)
            log('rise_initial.: %s' % rise_initial)
            log('rise_accel...: %s' % rise_acceleration)
            log('rise_max.....: %s' % rise_max)
            log('fall_lookback: %s' % fall_lookback)
            log('fall_initial.: %s' % fall_initial)
            log('fall_accel...: %s' % fall_acceleration)
            log('fall_max.....: %s' % fall_max)
            log('offset.......: %s' % offset)

    ''' High, Low, and Close '''
    high = data(interval=aggregation)[pair].period(2, 'high')
    low = data(interval=aggregation)[pair].period(2, 'low')
    close = data(interval=aggregation)[pair].period(2, 'close')

    ''' Build array of candles to look for SAR cross '''
    low_array = []
    high_array = []
    for z in range(sensitivity, 0, -1):
        low_array.append(low[-z])
        high_array.append(high[-z])

    ''' Determine if initial SAR is Rising or Falling '''
    if info.tick == 0:
        if close[-1] > close[-2]:
            storage[direction] = 1
            storage[previous] = low[-2]
            storage[extreme_point] = max(high_array)
            storage[acceleration] = rise_initial
        else:
            storage[direction] = -1
            storage[previous] = high[-2]
            storage[extreme_point] = min(low_array)
            storage[acceleration] = -fall_initial

    ''' Calculate Rising SAR '''
    # Define New SAR
    if storage[direction] == 1:
        sar = storage[previous] + storage[acceleration]*(
            storage[extreme_point] - storage[previous])
        # Update acceleration factor if EP is breached
        if high[-1] > storage[extreme_point]:
            storage[extreme_point] = high[-1]
            storage[acceleration] = storage[acceleration] + rise_acceleration
            if storage[acceleration] > rise_max:
                storage[acceleration] = rise_max
        # Define lookback price based on period
        if fall_lookback == 0:
            lookback = storage[extreme_point]
        else:
            lookback = []
            for z in range(fall_lookback):
                lookback.append(high[-(z+1)])
            lookback = max(lookback)
            lookback = min(lookback, storage[extreme_point])
        # If new SAR cross, then Stop and Reverse
        if min(low_array) < sar:
            storage[direction] = -2
            storage[acceleration] = -fall_initial
            sar = lookback * float(1 + offset)
            log('***** SAR CROSS *****')

    ''' Calculate Falling SAR '''
    # Define New SAR
    if storage[direction] == -1:
        sar = storage[previous] + storage[acceleration]*(
            storage[previous] - storage[extreme_point])
        # note the stored acceleration factor is negative in this instance
        # Update acceleration factor if EP is breached
        if low[-1] < storage[extreme_point]:
            storage[extreme_point] = low[-1]
            storage[acceleration] = storage[acceleration] - fall_acceleration
            if storage[acceleration] < -fall_max:
                storage[acceleration] = -fall_max
        # Define lookback price based on period
        if rise_lookback == 0:
            lookback = storage[extreme_point]
        else:
            lookback = []
            for z in range(rise_lookback):
                lookback.append(low[-(z+1)])  # fixed: original had L[-(z+1)], but low is intended here
            lookback = min(lookback)
            lookback = max(lookback, storage[extreme_point])
        # If new SAR cross, then Stop and Reverse
        if max(high_array) > sar:
            storage[direction] = 1
            storage[acceleration] = rise_initial
            sar = lookback * float(1 - offset)
            log('***** SAR CROSS *****')

    ''' Update Direction and Prior SAR '''
    if storage[direction] == -2:
        storage[direction] = -1
    storage[previous] = sar

    ''' Plot Parabolic SAR '''
    if sar_plot:
        plot('low', low[-1])
        plot('high', high[-1])
        plot('Parabolic_SAR', sar)

    ''' Log Tick Time '''
    finish_time = time.time()
    tick_time = round((finish_time - start_time), 5)
    # log(tick_time)

    return sar


def tick():

    if info.tick == 0:
        storage.begin_time = time.time()

    z = parabolic_sar(PAIR, SAR_PLOT, SAR_AGGREGATION, SAR_SENSITIVITY,
                      SAR_RISE_LOOKBACK, SAR_RISE_INITIAL, SAR_RISE_ACCELERATION, SAR_RISE_MAX,
                      SAR_FALL_LOOKBACK, SAR_FALL_INITIAL, SAR_FALL_ACCELERATION, SAR_FALL_MAX)

    # log('Parabolic SAR: %.2f' % z)


''' Log Total Backtest Time '''
def stop():

    end_time = time.time()
    run_time = float(end_time - storage.begin_time)
    log('Total Run Time: %.2f' % run_time)

'''
RISING SAR

Prior SAR: The SAR value for the previous period.
Extreme Point: The highest high of the current uptrend.
Acceleration Factor: Starting at .02, AF increases by .02 each
time the extreme point makes a new high. AF can reach a maximum
of .20, no matter how long the uptrend extends.

Current RISING SAR = Prior SAR + Prior AF(Prior EP - Prior SAR)

The Acceleration Factor is multiplied by the difference between the
Extreme Point and the prior period's SAR. This is then added to the
prior period's SAR. Note however that Rising SAR can never be above the
prior two periods' lows. Should SAR be above one of those lows, use
the lowest of the two for SAR.

FALLING SAR

Prior SAR: The SAR value for the previous period.
Extreme Point: The lowest low of the current downtrend.
Acceleration Factor (AF): Starting at .02, AF increases by .02 each
time the extreme point makes a new low. AF can reach a maximum
of .20, no matter how long the downtrend extends.

Current FALLING SAR = Prior SAR - Prior AF(Prior SAR - Prior EP)

The Acceleration Factor is multiplied by the difference between the
Prior period's SAR and the Extreme Point. This is then subtracted
from the prior period's SAR. Note however that Falling SAR can never be
below the prior two periods' highs. Should SAR be below one of
those highs, use the highest of the two for SAR.
'''
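
To make the arithmetic concrete, here is a quick worked example of the rising formula above, using made-up numbers:

prior_sar = 100.0   # prior period's SAR
prior_ep = 110.0    # extreme point (highest high of the uptrend)
prior_af = 0.04     # acceleration factor after one new high

current_rising_sar = prior_sar + prior_af * (prior_ep - prior_sar)
# 100 + 0.04 * (110 - 100) = 100.4; the SAR creeps toward the extreme point,
# a little faster each period as the acceleration factor grows.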

https://discuss.tradewave.net/t/parabolic-sar-custom-without-ta-lib/220/45
 

Could you clarify some of that? Is there a main bullet point to all of that?
 
So it appears, in my view, that the Feds are going to go after and shut down bitcoin exchanges, when they probably actually want all the BTC.

If they wanted "all teh BTC", they wouldn't have auctioned off the SR spoils.

There are too many people in the financial sector involved now to just sit and take it like that.
 
The BitLicense FUD has been somewhat relentless on the inside. If you want to go entirely by charting, something needs to happen soon or it may be a boring sideways period for another year yet like the period from 2011-2012.

Too much stuff in the pipeline for me to believe the latter...but anything is possible...

I wouldn't call 2011-2012 "sideways". It was the middle of a large bubble pop and took a while to make a new ATH, if that's what you mean, but I don't know if we can really compare those two because of the different circumstances now.

What kind of "stuff in the pipeline" do you see that makes you optimistic, if I may ask?
 
Could you clarify some of that? Is there a main bullet point to all of that?

That's an open-source Python script which can be used to calculate Parabolic SAR at tradewave.net (or another Python platform). The definition is reusable and, since SAR requires multiple stored objects, it automatically builds non-conflicting 24-character persistent stored-object names using a custom hash function, so that variable names do not overlap when you reuse the definition multiple times in one script. So you could, for example, calculate a SAR on 12h sticks and a SAR on 1m sticks in the same script.
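
A rough usage sketch of that idea, assuming the parabolic_sar definition above is in scope (the aggregations and the 0.02/0.2 values are just the defaults from the script, not a recommendation):

def tick():

    # 12h SAR: aggregation = 43200 seconds; plotted
    slow_sar = parabolic_sar(pairs.btc_usd, True, 43200, 1,
                             0, 0.02, 0.02, 0.2,
                             0, 0.02, 0.02, 0.2)

    # 1m SAR: aggregation = 60 seconds; plotting disabled so the two
    # instances don't draw over each other
    fast_sar = parabolic_sar(pairs.btc_usd, False, 60, 1,
                             0, 0.02, 0.02, 0.2,
                             0, 0.02, 0.02, 0.2)

    # Because the hash includes the aggregation, each call keeps its own
    # persistent state in storage.
    log('12h SAR: %.2f   1m SAR: %.2f' % (slow_sar, fast_sar))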


Also, if you read through the link in the thread... you'll find that my custom SAR is much more accurate and tunable than TA-lib. Here is an image of the TA-lib calculation failing:

[Image: chart showing the TA-lib Parabolic SAR calculation failing]



SAR should fall until it intersects the high price... then stop and reverse... then rise until it intersects the low price... then stop and reverse. The TA-lib library used at sites like cryptotrader.org is a black-box version of this calculation that lacks any sense of "support" or updates; there are bugs and it often fails. This, among other reasons, is why I've stopped development at cryptotrader and have been focusing on raw Python bots at tradewave.net.
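
As a concrete way to test for that failure, here is a minimal standalone sketch (plain Python, not a strategy; the function name and list inputs are my own) that scans parallel lists of highs, lows, and SAR values for bars where the SAR entered the candle's range but did not come out on the opposite side of price on the next bar:

# Hypothetical sanity check for "stop and reverse" behaviour.
def find_missed_reversals(highs, lows, sars):

    def side(i):
        if sars[i] < lows[i]:
            return 'below'   # rising SAR, sitting under price
        if sars[i] > highs[i]:
            return 'above'   # falling SAR, sitting over price
        return 'inside'      # cross bar: a stop and reverse is due

    missed = []
    last_side = None
    for i in range(len(sars) - 1):
        s = side(i)
        if s == 'inside' and last_side is not None:
            expected = 'above' if last_side == 'below' else 'below'
            if side(i + 1) != expected:
                missed.append(i)
        elif s != 'inside':
            last_side = s
    return missed

# e.g. find_missed_reversals([10, 11, 11.5], [9, 10, 10.5], [8.5, 10.5, 10.8])
# returns [1]: the SAR crossed into bar 1 but never reversed above price.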


I brought this issue with TA-lib up on the forum at cryptotrader; I was told it was my own fault and that I was using it wrong. Cryptotrader does not offer enough stored-object space to recreate SAR in a custom sense. As I pushed the issue, I was banned from the forum. Having brought the issue up at tradewave, the owner is looking to implement my SAR calculation as a replacement for the default TA-lib calculation in the coming weeks. It is apparent that my calculation is correct and TA-lib is simply wrong. If you had been trading on the TA-lib signal in the picture above, you would have bought and sold randomly about a dozen times over 10 days due to the TA-lib bug I uncovered, when you should have made only 2 trades.


The next script, which begins with:

def update_external_data():

demonstrates a new feature at tradewave. I'm now able to access data from external HTTP sources, like trollboxarchive.com. Essentially, this gives me the means to analyze things like how often someone is saying "LTC to da moon" in the trollbox.

Another advantage: I can also build external API log files, then later retrieve data from those logs. So I can, in effect, build a low-frequency bot which trades on 2h candles... while another bot that does nothing but "iceberg order" waits for an HTTP log-file signal from that bot to begin ordering on a 1m scale.
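
A very rough sketch of what the receiving 1m side could look like, assuming a hypothetical external log endpoint (the example.com URL and the 'BUY' token are made up for illustration; get_text() and storage are the same platform calls used in the trollbox script above):

# Hypothetical 1m "iceberg" side: poll an external log file written by the
# slower 2h bot and remember the last signal seen.
SIGNAL_URL = 'http://example.com/my_bot_log.txt'

def update_external_data():

    storage.remote_log = get_text(SIGNAL_URL)

def tick():

    remote_log = storage.get('remote_log', '')
    storage.signal = storage.get('signal', 'NONE')

    # act only when the low-frequency bot has posted a new instruction
    if 'BUY' in remote_log:
        storage.signal = 'BUY'

    log('current signal: %s' % storage.signal)
    # ...place small incremental orders here while the signal holds...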


The other script is about "Fourier analysis":

http://en.wikipedia.org/wiki/Fourier_analysis

Fourier analysis is a means of reconstructing a stochastic process as a sum of sine waves. You can then filter out the high-frequency elements of the wave to smooth the signal. The result is a line similar to a moving average, but one which projects into the future.
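
As a standalone numpy illustration of that idea (not a tradewave strategy; the function name and the choice of 5 harmonics are mine):

import numpy as np

def fft_smooth(prices, keep=5):
    # Low-pass filter a price series: take the FFT, keep only the DC term
    # and the first few harmonics (plus their negative-frequency mirrors),
    # and invert the transform back into a smoothed curve.
    spectrum = np.fft.fft(prices)
    filtered = np.zeros_like(spectrum)
    filtered[:keep] = spectrum[:keep]
    if keep > 1:
        filtered[-(keep - 1):] = spectrum[-(keep - 1):]
    return np.fft.ifft(filtered).real

# e.g. smoothed = fft_smooth(closing_prices, keep=5)
# Because the reconstruction is a sum of sinusoids, those same sinusoids can
# also be evaluated past the end of the window, which is where the
# "projects into the future" idea comes from.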

Another recent development I've released which projects into the future is polynomial regression:

https://discuss.tradewave.net/t/scipy-polynomial-regression/268
 
Last edited:
I wouldn't call 2011-2012 "sideways". It was the middle of a large bubble pop and took a while to make a new ATH, if that's what you mean, but I don't know if we can really compare those two because of the different circumstances now.

Yeah I wasn't very clear.

In 2011, BTC did the same slow tumble for many months. Then things sort of bounced around a $12 average for most of 2012, if I recall. If that were to happen to BTC again, then 2015 is when things would shoot up again. Circumstances are way different now, though, I agree.

What kind of "stuff in the pipeline" do you see that makes you optimistic, if I may ask?

Retailers, PayPal, ETFs, emerging markets.

The biggest enemy of BTC, I think, is the Bitcoin Foundation itself. They are being super quiet when things were supposed to have been announced by now. BitPay & Coinbase are going right along with whatever they say, since they are essentially a main part of the "Foundation" themselves.
 
pres,

Mind doing some charts again? Was 540 a fib line that might be a problem now that it was broken?
 