Jetpack Publicize: Setting the image and text shown on Twitter and Facebook

I use Jetpack Publicize to share my posts on Facebook and Twitter. This makes it easy to create a single WordPress post and have it show up on social media sites. Those services automatically use the Featured Image of the WordPress post, plus an excerpt of the title/text.

There’s another cool feature: Facebook and Twitter will include a photo and text summary for your site whenever someone pastes your URL into a post on either service.

But… It can be tricky to figure out how Facebook/Twitter retrieve the image and the text. (I started this quest because Facebook was choosing the wrong image for the thumbnail. I found my first clue by reading WP Beginner’s article How to fix Facebook Incorrect Thumbnails.) This led to the debugging techniques below.

Both Facebook and Twitter use Open Graph meta tags to find the desired image, title, and description. Jetpack automatically sets the og:image, og:title, and og:description tags. It also inserts the Twitter-specific twitter:text:title and twitter:description tags.
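For reference, these tags are ordinary meta elements in the page’s head. Here is a minimal sketch (with hypothetical values) of what they look like, and how you could extract them yourself with Python’s standard library:

```python
from html.parser import HTMLParser

# Hypothetical sample of the kind of markup Jetpack emits in <head>.
SAMPLE = """<head>
<meta property="og:title" content="My Post Title" />
<meta property="og:description" content="A short excerpt of the post text." />
<meta property="og:image" content="https://example.com/featured-image.jpg" />
<meta name="twitter:text:title" content="My Post Title" />
<meta name="twitter:description" content="A short excerpt of the post text." />
</head>"""

class OGParser(HTMLParser):
    """Collect Open Graph and Twitter meta tags into a dict."""
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        a = dict(attrs)
        # Facebook uses property="og:…"; Twitter uses name="twitter:…"
        key = a.get("property") or a.get("name")
        if key and key.startswith(("og:", "twitter:")):
            self.tags[key] = a.get("content")

p = OGParser()
p.feed(SAMPLE)
print(p.tags["og:image"])  # the image Facebook will choose for the thumbnail
```

Fetching your own page and running it through a parser like this is a quick way to see what the debuggers below will see.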

To see what those services will display, each has a Debugger/Validator. To use them, go to the page below and type in your URL.

Facebook Validator:
Twitter Card Validator:

Problem:  Jetpack automatically sets the OpenGraph tags, but sometimes it chooses values that are not useful, and they cannot be modified within Jetpack. (At least, Jetpack support could not tell me how to set them.)

Update: Using Jetpack 7.3.1 and Yoast 11.3, the conflict between these two plugins appears to have gone away. But this note still describes useful debugging techniques for Facebook and Twitter. (31May2019)

Solution – Part 1: The Yoast SEO plugin – I use the free version – also lets you set the Open Graph tags for both Facebook and Twitter. (In addition, I really like how it helps to optimize my text for search engines.) However, Yoast and Jetpack’s Open Graph tags interfere with each other.

Solution – Part 2: This is a little bit yucky, but you can disable Jetpack’s Open Graph tags with a bit of code in your child theme’s functions.php file, as described in Remove Jetpack’s Open Graph meta tags. This is no longer required – see the Update above.

TL;DR: (Too Long; Didn’t Read) – now live!

I have reworked my blog so that the primary domain name is “Random Neurons Firing” instead of the pedestrian original. Same content, but a better name.

I’m also adding a new topic to those I’ve previously covered (“Software, Networking, Life”). Over the last two years, I have gone to many planning and zoning conferences to learn more about how to provide attractive housing within communities. I’ll post my notes from those conferences and workshops here. I need to note that these will be my own opinions, and not those of any public boards to which I might belong.

Feel free to share this post on Facebook, LinkedIn, Twitter, or email by clicking one of the icons below. Any opinions expressed here are solely my own, and not those of any public bodies, such as the Lyme Planning Board or the Lyme Community Development Committee, where I am/have been a member. I would be very interested to hear your thoughts – you can reach me at

Spring 2019 Business Leaders Housing Breakfast

On May 3, 2019, I attended the Spring 2019 Business Leaders Housing Breakfast sponsored jointly by Vital Communities  and Twin Pines Housing. Twice a year, these groups bring together a group of speakers to address questions about housing. I have included links to the three presentations. Here are my takeaways:

Bennington Healthy Homes 

Kevin Dailey of Southwestern Vermont Health Care (SVHC) described how potential employees have trouble finding housing close to the hospital and consequently endure a long commute. SVHC established a program to acquire abandoned homes that are not economically viable as commercial renovations. They then upgrade each home so that there will be no major expenses for 10 years, and pay the closing costs when an employee purchases it. They expect to spend about $25,000 per home, and have done four homes so far.

Woodstock Housing Initiative

Jill Davies spoke about how Woodstock Community Trust established a program to help moderate income people live in Woodstock, Vermont. They point out the need by asking these questions:

  • Who’s saving your life?
  • Who’s teaching your kids?
  • Who’s cooking your food?
  • Can they live in your community?
The Housing Initiative is about to sell their first home, working with Twin Pines Housing to help home buyers using a model developed on Martha’s Vineyard:
  • A buydown program, where the initiative puts up the down payment, mortgage insurance, and closing costs to help a person who can afford the monthly payments but doesn’t have the cash for up-front costs
  • They make structural repairs and fix major appliances, to avoid big bills in the first three years

Upper Valley Real Estate Update

In their semi-annual review of real estate housing trends, Buff McLaughry of Four Seasons Sotheby’s International Realty and Lynne LaBombard of LaBombard Peterson Real Estate LLC gave these highlights:
  • They consider the Upper Valley to have 69 towns, about 187,000 population, and 87,000 jobs.
  • Affordable housing is a core component of community health. If housing isn’t readily available, the community suffers.
  • Rental properties across the Upper Valley have less than 3% vacancy (very low); this is down about 10% from a year ago.
  • The commute to jobs has not changed much in the last year, but remains high at about 45 minutes.
  • Number of home sales has remained roughly constant over the last year, but inventory has changed:
    • Homes below $300K: inventory is strongly down – hardly any are available
    • Homes between $300K and $600K: down, less available
    • Homes above $600K: about the same, or slightly higher inventory


2019 Homeownership Conference – NHHFA

On March 19, 2019, the New Hampshire Housing Finance Authority held its 2019 Homeownership Conference. Agenda is at:

How can we get more homes built? How can we keep pace with the demand?

Dean Christon, Executive Director of NHHFA, pointed to several initiatives:

St. Anselms College Center for Ethics in Business & Governance

  • Need more funding
  • Need a state-wide housing board
  • Need a legislative study on housing density

There are three interesting bills in the legislature and their sponsors:
SB 306 – Housing Appeals Board – Giuda
SB 15 – Affordable Housing Fund – Bradley
SB 43 – Density Study – Fuller-Clark

Finally, he introduced the NHHFA 2019 Housing Market Report. Read it at

Improving Housing Affordability

Elliot Eisenberg (the “Bowtie Economist”) gave an entertaining and insightful review of housing issues. His major points:

  • The yield curve has inverted. This is a symptom of a non-zero chance of recession, but at least 12 months out
  • There is a slowing economy, both in the US and the world. This is leading to slow growth
  • Regulatory costs add about $20,000 to $30,000 to the typical home. Mostly these are costs of complying with zoning
  • There are labor supply problems – builders cannot attract qualified workers, driving up labor costs
  • Land costs are high, making it more expensive to build
  • Builders have stopped building homes below $250,000
  • Rent growth has slowed recently
  • Municipalities should practice dynamic zoning. We regularly rezone commercial and industrial properties, but rarely do we allow changes to the density of residential districts
  • We must get housing density up, to decrease the cost of homes

He gave a similar presentation to the Greater Houston Builders Association Conference two weeks prior. That video is at:

New Hampshire’s Housing Needs

Russ Thibault, Applied Economic Research, gave a talk about the New Hampshire housing needs. High demand for rental housing – “If the paint is dry, it’s occupied.” There is a shortage of 15,000 to 20,000 units in NH, with a demand for about 3,500 new units per year.

However, new construction is lagging because of high prices for construction materials (especially lumber), high labor costs (2.5% unemployment makes it hard to find qualified carpenters), and high land prices. These factors make existing housing a relative bargain, which drives up those prices.

He also gave a lesson in the history of zoning: In the 1980s, there was a sense of unease that the entire state would become overrun because of development pressure from Massachusetts. Ordinances were put in place to create a regulatory framework to prevent towns (especially in the southern part of NH) from being overwhelmed. These laws succeeded, but are now too restrictive and keep towns and municipalities from developing reasonable housing alternatives.

Engaging Stakeholders to Find Solutions

A panel discussion with the following people:

Builder: KEVIN LACASSE, New England Family Housing
Realtor: MATT MERCIER, Jill & Co. Realty Group
Architect/Engineer: TIM NICHOLS, AECm
Banker: PETER RAYNO, Enterprise Bank
Local Government: STEPHEN BUCKLEY, NH Municipal Association
Moderator: BEN FROST, New Hampshire Housing

I took a few notes:

  • Mixed use development is good: Single family or rental above office or retail on the first floor.
  • Modular/pre-engineered construction helps improve labor cost, schedule, and quality, with less waste, lower environmental impact, and bigger (annual) energy savings.
  • Riggins Rules – a good framework for being an effective land use board.

Summary of the Day’s Discussion

Ignatius Maclellan, Managing Director of the Homeownership Division of NHHFA, offered these thoughts:

  • Fully fund the State of NH Sewer Expansion work.
  • There are planning grants that can mitigate developer risk
  • To get market-price homes, streamline approval processes


No on Lyme’s Article #2

Background: In March 2019, the Planning Board proposed an amendment to the Lyme Zoning Ordinance to modify the current Lot Size Averaging provision. I sent the following notes to the Lyme Listserv to ask residents not to vote for this change.

Update: At Town Meeting, Article 2 (the Lot Size Averaging amendment) failed, with 282 votes against and 198 in favor.

To the Lyme Listserv,

I am writing because a couple people asked my opinion of the Lot Size Averaging (LSA) amendment (Article #2) on Tuesday’s warrant.

Lot Size Averaging (LSA) could be a useful tool, but as amended, it dramatically decreases what is allowed in comparison with what an owner could build in a conventional subdivision. The details make for a long message, but in sum it permits much less in the way of houses (about 50% less) and outbuildings (up to 80% less).

Furthermore, the “procedural simplicity” claimed for the new language comes by removing the right of an applicant to appeal to the zoning board for relief from the new restrictions.

Given these new constraints on what can be built, this proposed LSA amendment doesn’t provide value to Lyme, or to a resident who would consider using it.

That said, smaller, more dense (and less expensive) housing can help those who want to downsize and stay in town. It also helps the many people who serve the residents of Lyme in our schools, homes, restaurants, and who provide services such as hairdressers, firefighters, nurses, and electricians. They should be able to live in Lyme, too.

If you think that alternatives should be allowed in Lyme, please vote No on Article #2 on the paper ballot this Tuesday, and encourage our dedicated volunteer planners to return to the drawing board and consider other strategies for housing.

Rich Brown

In a followup email, Mary Callahan wrote:

Can anyone give me more information on this article? I’ve read the article on the town report but I’m wondering about the background. How does this differ with what’s in place now and what is the need that the board sought to address with this amendment? Maybe someone can point me to minutes on the town site, I haven’t been able to find them.

Thanks for your curiosity. As I noted in my previous message, Lyme has a complicated ordinance regarding how people can build on their land. Since our application for subdivision triggered this amendment, let me use our property to illustrate how the new language constrains the ability to build homes.

To summarize, our 96-acre parcel at the Loch Lyme Lodge yields only four buildable lots under both the current and new language. However, the new language would limit our building footprint by more than 50%, the lot coverage by 80%, and the gross floor area by two thirds from what could currently be allowed on the parcel.

The Details

The Lyme Ordinance promotes single-family homes on large lots. A “buildable lot” must have at least three acres remaining after subtracting conservation overlays – wetlands and buffers, agricultural soil, steep slopes. Parent parcels frequently need to be twice or three times as large (six to ten or more acres) to meet that minimum lot size.

Under Current Ordinance:

Pinnacle Project owns a 96 acre parcel in the Rural district at the Loch Lyme Lodge that could be used for residential purposes. After subtracting conservation overlays and applying the three/five acre rules, the 96 acre parcel ends up with only four “buildable lots” of 3.0, 5.6, 6.8, and 8.6 acres, respectively. These lot sizes determine the Building Footprint, Lot Coverage, and Gross Floor Area for the homes.

Building Footprint – In a conventional subdivision of our land, the current ordinance would permit home footprints that are 2% of the lot size, or 2,622, 4,855, 5,949, and 7,000 square feet for a total of 20,426 sf.

Maximum Lot Coverage limits the total square footage of the home, garage, barn, workshop, greenhouses, sheds, outbuildings, etc. In a conventional four-lot subdivision the allowance is a generous 12% of the lot size, capped at 26,000 sf. Three of our lots would allow 26,000 sf; the fourth would allow 15,734 sf of lot coverage, for a total of 93,734 sf.

Gross Floor Area in the residential district is capped at 14,000 sf. In practice, the height limitations mean that the homes likely would have two stories, and our buildings would max out at 5,244, 9,710, 11,898, and 14,000 sf, a total of 40,852sf.

I acknowledge that these would be large homes, but if someone wanted to build them, they would be permitted automatically (subject to suitable septic, driveway, etc. plans) under the current ordinance. I also note that the setback requirements and other dimensional controls would require that the homes be spread out across the property with no protection of open space.

Using the New Lot Size Averaging language:

If an applicant instead wanted to avoid breaking up the open space on a parcel and cluster the homes, the new ordinance applies severe constraints on these homes. Using the same lot sizes for our property:

Building Footprint: limited to 2,500sf per home. Four homes would total 10,000sf, or less than 50% of what is currently allowed.

Maximum Lot Coverage: limited to 4,500sf. Four homes would total 18,000sf, or only 20% of what would be allowed.

Gross Floor Area: limited to 3,000sf. Four homes would total 12,000sf, or less than a third of what would be allowed currently.
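As a sanity check on those ratios, here is a quick sketch using the square-footage totals quoted above:

```python
# (current ordinance total, proposed LSA total), in square feet, from the text.
limits = {
    "building footprint": (20426, 10000),
    "lot coverage":       (93734, 18000),
    "gross floor area":   (40852, 12000),
}

for name, (current, proposed) in limits.items():
    pct = proposed / current * 100
    print(f"{name}: {pct:.0f}% of the current allowance")
# building footprint: 49% of the current allowance
# lot coverage: 19% of the current allowance
# gross floor area: 29% of the current allowance
```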

Two More Points:

The current ordinance also limits building sizes for lot size averaging, but includes an option to go to the Zoning Board to request relief from those constraints. The new language explicitly removes this option.

Finally, the building limits seem arbitrary. During both the drafting hearings and the public hearings, the board set the footprint and gross floor area limits based on their feeling that these were “reasonably sized” and “generous” homes, according to their “perception of what’s reasonable.”

Summary: The proposed language is a disincentive for the use of Lot Size Averaging. In the only recent case, this new language would limit the footprint by more than 50%, the lot coverage by 80%, and the gross floor area by two thirds from what could currently be allowed.

This is not a useful change, and only continues the Ordinance’s promotion of single family homes on large lots.

If you agree, please vote No on Article #2 and encourage our dedicated volunteer planners to consider other strategies for housing.


SQLite Date and Time Functions – explained

A while back, I was greatly confused by SQLite date and time functions. It took me a while to figure out what was wrong. (It was my error: I hadn’t observed the rule that dates must have this form “YYYY-MM-DD” – four digit year, two-digit month and day.)

Nevertheless, I found that the documentation wasn’t quite clear, so I wrote up these notes as an adjunct to SQLite Datatypes and the SQLite Date and Time Functions pages.

2.2. Date and Time Datatype

SQLite does not have a storage class set aside for storing dates and/or times.
The conventional way to store dates is as a string in a TEXT field.
These fields can be compared directly (as strings) to determine equality or order.

For other date-as-string formats, see Date Strings on the Date And Time Functions page.

For further manipulations on dates and times, the built-in Date And Time Functions of SQLite convert dates and times between TEXT, REAL, or INTEGER values:

  • TEXT as strings (“YYYY-MM-DD HH:MM:SS.SSS” – with leading zero where required, and four-digit year – a so-called “timestring”)
  • REAL as Julian day numbers, the number of days (with fractional part) since noon in Greenwich on November 24, 4714 B.C. according to the proleptic Gregorian calendar.
  • INTEGER as Unix Time, the number of seconds since 1970-01-01 00:00:00 UTC.

Applications can choose to store dates and times in any of these formats and freely convert between formats using the built-in date and time functions.
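A minimal sketch of those conversions, using Python’s built-in sqlite3 module (any SQLite client would do). It converts one TEXT timestring to a REAL Julian day and an INTEGER Unix time, then back again:

```python
import sqlite3

con = sqlite3.connect(":memory:")
ts = "2004-08-19 18:51:06"          # TEXT timestring

# TEXT -> REAL (Julian day) and TEXT -> INTEGER (Unix seconds, returned as text)
jd, unix = con.execute(
    "SELECT julianday(?), strftime('%s', ?)", (ts, ts)
).fetchone()

# ...and back to TEXT again:
from_jd = con.execute("SELECT datetime(?)", (jd,)).fetchone()[0]
from_unix = con.execute(
    "SELECT datetime(?, 'unixepoch')", (int(unix),)
).fetchone()[0]
assert from_jd == from_unix == ts   # round-trips cleanly
```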

Date And Time Functions

The Date and Time Functions page doesn’t really define the arguments or the return types, so I make them explicit below.

Timestring: The conventional way to store dates is as a timestring – a TEXT field (e.g., “YYYY-MM-DD HH:MM:SS.SSS”). These fields can be compared directly (as strings) to determine equality or order.
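Because ISO-8601 timestrings put the most significant fields first, plain string comparison gives chronological order – but only if every field is zero-padded. A tiny Python illustration:

```python
# Zero-padded ISO-8601 dates sort chronologically as plain strings:
dates = ["2019-12-15", "2004-08-19", "2019-02-01"]
assert sorted(dates) == ["2004-08-19", "2019-02-01", "2019-12-15"]

# But only with leading zeros: without padding, "2019-2-1" sorts
# AFTER "2019-12-15", because "2" > "1" character-by-character.
assert "2019-12-15" < "2019-2-1"
```

This is exactly why the four-digit-year, two-digit-month-and-day rule matters.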

To convert to other date representations, SQLite supports five date and time functions. All take a timestring (a subset of ISO 8601 date and time formats, listed below) as an argument. The timestring is followed by zero or more modifiers. The strftime() function also takes a format string as its first argument.

  1. date(timestring, modifier, modifier, …) Returns the date as a string: “YYYY-MM-DD”.
  2. time(timestring, modifier, modifier, …) Returns the time as a string: “HH:MM:SS”.
  3. datetime(timestring, modifier, modifier, …) Returns a string: “YYYY-MM-DD HH:MM:SS”.
  4. julianday(timestring, modifier, modifier, …) Returns the Julian day as a REAL – the number of days (and fractional part) since noon in Greenwich on November 24, 4714 B.C. (proleptic Gregorian calendar).
  5. strftime(format, timestring, modifier, modifier, …) Returns the date formatted according to the format string specified as the first argument. The format string supports the most common substitutions found in the strftime() function from the standard C library plus two new substitutions, %f and %J.
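A quick sketch of the five functions’ return values, using Python’s sqlite3 module and a fixed timestring (the same moment as the Unix-timestamp example below) so the results are reproducible:

```python
import sqlite3

con = sqlite3.connect(":memory:")
ts = "2004-08-19 18:51:06"

# One SELECT exercising all five functions on the same timestring.
d, t, dt, jd, fmt = con.execute(
    "SELECT date(?), time(?), datetime(?), julianday(?), "
    "strftime('%d/%m/%Y', ?)",
    (ts, ts, ts, ts, ts),
).fetchone()

print(d)    # '2004-08-19'            (TEXT)
print(t)    # '18:51:06'              (TEXT)
print(dt)   # '2004-08-19 18:51:06'   (TEXT)
print(jd)   # ~2453237.2855           (REAL)
print(fmt)  # '19/08/2004'            (TEXT, per the format string)
```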

… see the original SQLite page for modifiers and legal timestring formats …


This section replicates the examples of the original page, but includes the results and return types of each function.

Compute the current date. Returns timestring.

SELECT date('now');  -- Result: 2018-03-07

Compute the last day of the current month. Returns timestring.

SELECT date('now','start of month','+1 month','-1 day'); -- Result: 2018-03-31

Compute the date and time given a unix timestamp 1092941466. Returns timestring.

SELECT datetime(1092941466, 'unixepoch'); -- Result: 2004-08-19 18:51:06

Compute the date and time given a unix timestamp 1092941466, and compensate for your local timezone. Returns timestring.

SELECT datetime(1092941466, 'unixepoch', 'localtime'); -- Result: 2004-08-19 14:51:06

Compute the current unix timestamp. Returns INTEGER.

SELECT strftime('%s','now');  -- Result: 1520444198

Compute the number of days since the signing of the US Declaration of Independence. Returns REAL – days and fractions of a day.

SELECT julianday('now') - julianday('1776-07-04'); -- Result: 88269.7339379285

Compute the number of seconds since a particular moment in 2004: Returns INTEGER.

SELECT strftime('%s','now') - strftime('%s','2004-01-01 02:34:56'); -- Result: 447519729

Compute the date of the first Tuesday in October for the current year. Returns timestring.

SELECT date('now','start of year','+9 months','weekday 2'); -- Result: 2018-10-02

Compute the time since the unix epoch in seconds (like strftime(‘%s’,’now’) except includes fractional part). Returns REAL – days and fractions of a day.

SELECT (julianday('now') - 2440587.5)*86400.0; -- Result: 1520444280.01899

A Practical Tutorial for SQLite Date Functions

The SQLite document doesn’t really show how to use date functions in actual code. Here is an example of inserting and retrieving dates in a table.

Note: It is good practice to store dates as text in a datestring format – YYYY-MM-DD. The “BMW” entry below is inserted as an integer number of seconds, and doesn’t work right when trying to use the julianday() function.

bash-3.2$ sqlite3
SQLite version 3.29.0 2019-07-10 17:32:03
Enter ".help" for usage hints.
Connected to a transient in-memory database.
Use ".open FILENAME" to reopen on a persistent database.
sqlite> create table car_table(car_name text, car_date text);
sqlite> insert into car_table values ("Ford", date('now'));
sqlite> insert into car_table values ("Toyota", date('now','7 days'));
sqlite> insert into car_table values ("BMW", strftime('%s','now'));
sqlite> select * from car_table;
BMW|1576431883       <-- uh oh, this is in seconds, not a datestring
sqlite> select car_name, julianday(car_date) from car_table;
BMW|                 <-- uh oh, this isn't a julianday, as expected
sqlite>              ^D to exit
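The broken “BMW” row can be salvaged with the 'unixepoch' modifier, which tells julianday() that the stored number is Unix seconds rather than a Julian day. Here is a sketch of the same table in Python’s sqlite3 module, with fixed dates substituted for 'now' so the results are reproducible:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE car_table (car_name TEXT, car_date TEXT)")
con.execute("INSERT INTO car_table VALUES ('Ford', date('2019-12-15'))")
# Stored as Unix seconds, like the BMW row in the transcript above:
con.execute("INSERT INTO car_table VALUES ('BMW', strftime('%s', '2019-12-15'))")

# Plain julianday() works for Ford but yields NULL for BMW's seconds value:
rows = dict(con.execute("SELECT car_name, julianday(car_date) FROM car_table"))
print(rows["Ford"])  # 2458832.5
print(rows["BMW"])   # None -- neither a valid timestring nor a Julian day

# The fix for seconds-based values: the 'unixepoch' modifier.
(jd,) = con.execute(
    "SELECT julianday(car_date, 'unixepoch') FROM car_table "
    "WHERE car_name = 'BMW'"
).fetchone()
print(jd)            # 2458832.5 -- now matches Ford's date
```

Even so, storing the datestring directly, as the note above recommends, avoids the problem entirely.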

Local by Flywheel won’t start because it’s regenerating Docker Machine TLS certificates

I have been using Local by Flywheel and really enjoying it. It does two things:

  1. You can stand up a development version of a WordPress site on your laptop and horse around with it. It’s fast, you can make experiments, and if it blows up, you can simply regenerate in a minute or two.
  2. Using the (paid) Flywheel hosting, you can transfer your local dev server to their public hosting, and you’re on the air.

I have not used this latter facility, but I’m here to tell you that the first part is pretty slick.

But… I went away from Local by Flywheel for a month or so, then came back to start working on a new site. When I wanted to start it up, I saw a succession of messages stating that it was “Regenerating Machine Certificates” and that “Local detected invalid Docker Machine TLS certificates and is fixing them now.” This looped apparently forever, and Local wouldn’t start. Here’s my report on their community forum.

After considerable searching, I found a procedure from one of the developers that seems to do the trick. It involves downloading a new version of the Boot2Docker ISO file and letting the system re-provision itself. The process involved a) creating an alias (“local-docker-machine”) for Local by Flywheel’s docker-machine binary, then b) issuing a series of commands to that alias:

local-docker-machine stop local-by-flywheel
rm -rf ~/.docker/machine/certs
local-docker-machine create local-cert-gen
local-docker-machine start local-by-flywheel
local-docker-machine regenerate-certs -f local-by-flywheel
local-docker-machine rm -f local-cert-gen

These steps caused Local by Flywheel to recognize that the Boot2Docker ISO was out of date. It triggered a download of the new version, and gave the output below. When it completed, Local by Flywheel worked as expected. Whew!

bash-3.2$ alias local-docker-machine="/Applications/Local\ by\"
bash-3.2$ local-docker-machine stop local-by-flywheel; rm -rf ~/.docker/machine/certs; local-docker-machine create local-cert-gen; local-docker-machine start local-by-flywheel; local-docker-machine regenerate-certs -f local-by-flywheel; local-docker-machine rm -f local-cert-gen;
Stopping "local-by-flywheel"...
Machine "local-by-flywheel" is already stopped.
Creating CA: /Users/richb/.docker/machine/certs/ca.pem
Creating client certificate: /Users/richb/.docker/machine/certs/cert.pem
Running pre-create checks...
(local-cert-gen) Default Boot2Docker ISO is out-of-date, downloading the latest release...
(local-cert-gen) Latest release for is v18.09.1
(local-cert-gen) Downloading /Users/richb/.docker/machine/cache/boot2docker.iso from
(local-cert-gen) 0%....10%....20%....30%....40%....50%....60%....70%....80%....90%....100%
Creating machine...
(local-cert-gen) Copying /Users/richb/.docker/machine/cache/boot2docker.iso to /Users/richb/.docker/machine/machines/local-cert-gen/boot2docker.iso...
(local-cert-gen) Creating VirtualBox VM...
(local-cert-gen) Creating SSH key...
(local-cert-gen) Starting the VM...
(local-cert-gen) Check network to re-create if needed...
(local-cert-gen) Waiting for an IP...
Waiting for machine to be running, this may take a few minutes...
Detecting operating system of created instance...
Waiting for SSH to be available...
Detecting the provisioner...
Provisioning with boot2docker...
Copying certs to the local machine directory...
Copying certs to the remote machine...
Setting Docker configuration on the remote daemon...
Checking connection to Docker...
Docker is up and running!
To see how to connect your Docker Client to the Docker Engine running on this virtual machine, run: /Applications/Local by env local-cert-gen
Starting "local-by-flywheel"...
(local-by-flywheel) Check network to re-create if needed...
(local-by-flywheel) Waiting for an IP...
Machine "local-by-flywheel" was started.
Waiting for SSH to be available...
Detecting the provisioner...
Started machines may have new IP addresses. You may need to re-run the `docker-machine env` command.
Regenerating TLS certificates
Waiting for SSH to be available...
Detecting the provisioner...
Copying certs to the local machine directory...
Copying certs to the remote machine...
Setting Docker configuration on the remote daemon...
About to remove local-cert-gen
WARNING: This action will delete both local reference and remote instance.
Successfully removed local-cert-gen

Internet Identity, Nationwide Bank, and the Post Office

Dave Winer wrote about “internet identity” and that several companies were probably thinking about solving the problem. Specifically, he said:

But because money is so central to identity, it’s surprising that there isn’t a Google or Amazon of identity. Seems there’s money to be made here. An organization with physical branches everywhere, with people in them who can help with indentity (sic) problems.

This reminded me of the proposal to have US Post Offices become banks (for example, here and a zillion other places.)

The advantages:

  • There are post offices everywhere. The postal system is constitutionally mandated to be present, so it’s useful for them to have a valuable mission even as the volume of paper mail declines.
  • The “Bank of the US Post Office” could provide an ATM at each branch. You could withdraw cash without fees anywhere in the US.
  • They could provide low-cost (no-cost?) savings/checking accounts for the traditionally “unbanked”, instead of making people use check cashing services, payday lenders, etc., who siphon off a percentage of each transaction.
  • Postal employees have a strong ethos of caring for the transactions, and already have procedures for handling cash.
  • Post Offices are accustomed to handling critical, private matters in a timely way.

Identity management seems another valuable service that the USPS might provide.

Linking Reservation Nexus and TripAdvisor


We wanted our room availability to show up in TripAdvisor and other online services. There are two basic steps, where you tell Reservation Nexus and TripAdvisor how to find each other’s information:

  • Use Reservation Nexus Availability Exchange to share your room availability
  • Use TripAdvisor TripConnect to link up your business to the Reservation Nexus listings

Note: The business name, postal address, URL, and email must be exactly the same in both ResNexus and TripAdvisor. Check them before starting this procedure:

On the Reservation Nexus site:

  1. In the ResNexus Settings choose Availability Exchange, near the bottom of the settings (first image below).
  2. In the Availability Exchange page:
    • Click the REGISTER button to register your rooms
    • Click Only share my availability… and check off the desired services. (second image)
  3. Click SAVE. The resulting page (third image below) shows:
    • Your Availability Exchange ID next to the UNREGISTER button
    • The Last full synch time

On your TripAdvisor site:

  1. Log into TripAdvisor
  2. Go to and click Check your Eligibility. It will show a page naming your property to link to the Cost per Click program. (first image below)
  3. Click Get Connected. You will see a page listing the choices. (second image)
  4. Find “Reservation Nexus” and click it to select it, then click Confirm. (third image)
  5. The confirmation page (fourth image below) should show property prices for a specific night. This confirms that the connection has been established. Continue with the cost-per-click process with TripConnect.
  6. If you see an error (fifth image), ensure that your contact information for Reservation Nexus and TripAdvisor are exactly the same.


  • When it works, the connection between Reservation Nexus and TripAdvisor should happen almost immediately, and you should see the confirmation page listing your property prices.
  • If you had to modify your ResNexus info, then you may need to contact ResNexus to have them re-publish your TripConnect info.
  • Contact Reservation Nexus if the connection has not completed within an hour.

Taxpayer-Funded Networks – all that bad?

I saw an article fretting about taxpayer-funded broadband projects in Texas Monitor. It cites a “study” by the Taxpayer Protection Alliance Foundation that purports to show a wide swath of “failed taxpayer-funded networks”.

A little research on the site led me to realize that it’s not first-rate work – outdated, incorrect information – so I left the following comment on the Texas Monitor site:

I decided to check the “Broadband Boondoggles” site to see what information they provide. First off, the copyright date on the site’s footer says 2017 – are they even updating it?

More specifically, I found that they disparage the local project (in VT) of which I have personal knowledge. They state that as of January 2015 ECFiber has spent $9M to connect 1,200 subscribers (“an astounding $7,500 per customer.”)

Well, that may be true – as of that date. If they had bothered to follow up with ECFiber’s progress ( they would have learned:

  • As of January 2018 they have connected over 2000 customers (cost per subscriber is now roughly half that reported number)
  • They’re hampered by the pole “make ready” process by the incumbent monopoly carriers who are slow to respond. They could connect subscribers faster if the carriers would follow their legal make-ready obligations.
  • ECFiber is a private community effort, entirely funded with grants and private equity/loans, so I’m curious how they could even have filed a FOIA request.
  • They’ve now raised $23M capital (from the private markets), to reach 20,000 subscribers.
  • This gives a system-wide average cost of $1,150/subscriber – a very attractive cost.

I’m sure there are false starts and overruns for many municipal projects, but if this outdated information is typical of the remainder of the TPAF site, then I would be reluctant to accept any of its conclusions without doing my own research.