I have a table that looks like this:
episodes
------------------------------------------------------------
id (PK serial) | show_id (int4) | episode_number (int2[])
------------------------------------------------------------
1              | 1              | {1}
2              | 1              | {2}
...
I'm working with some code that contains several queries whose effect is: if a row already exists with some of the data filled in, that row is updated with the rest of the data; if it does not exist, a new row is created. They look like this:
INSERT INTO table_name (col1, col2, col3)
SELECT %s AS COL1, %s AS COL2, %s AS COL3
FROM ...
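If the target is a reasonably recent PostgreSQL (9.5 or later), a single INSERT ... ON CONFLICT covers both halves of that pattern; a minimal sketch, assuming a hypothetical table_name with a unique constraint on (col1, col2):
-- Assumes a unique constraint or index on (col1, col2); all names are illustrative.
INSERT INTO table_name (col1, col2, col3)
VALUES (%s, %s, %s)
ON CONFLICT (col1, col2)
DO UPDATE SET col3 = EXCLUDED.col3;  -- existing row: fill in the remaining data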
I know of two ways to insert without duplication. The first is using a WHERE NOT EXISTS clause:
INSERT INTO table_name (col1, col2, col3)
SELECT %s, %s, %s
WHERE NOT EXISTS (
    SELECT * FROM table_name AS T
    WHERE T.col1 = %s
      AND T.col2 = %s)
The other uses a LEFT JOIN:
INSERT INTO table_name (col1, col2, col3)
SELECT ...
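For reference, the LEFT JOIN variant is usually written as an anti-join roughly like the sketch below (source_table and the column names are only illustrative):
-- Insert only those source rows that have no counterpart in table_name yet.
INSERT INTO table_name (col1, col2, col3)
SELECT s.col1, s.col2, s.col3
FROM source_table AS s
LEFT JOIN table_name AS t
       ON t.col1 = s.col1
      AND t.col2 = s.col2
WHERE t.col1 IS NULL;  -- NULL on the right side means "no match found"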
I have a situation where I want to insert a row if it doesn't exist, and not insert it if it already does. I tried writing SQL queries that prevent this from happening (see here), but I was told a solution is to create constraints and catch the exception when they're violated.
I have constraints in place already. My question is - ...
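The constraint-plus-exception approach usually means wrapping the INSERT in a PL/pgSQL block and swallowing unique_violation; a sketch, assuming a unique constraint on (col1, col2) and an illustrative function name:
-- Hypothetical helper; relies on an existing unique constraint on (col1, col2).
CREATE OR REPLACE FUNCTION insert_if_missing(v1 int, v2 int, v3 int)
RETURNS void AS $$
BEGIN
    INSERT INTO table_name (col1, col2, col3) VALUES (v1, v2, v3);
EXCEPTION WHEN unique_violation THEN
    NULL;  -- the row already exists, so silently do nothing
END;
$$ LANGUAGE plpgsql;
On PostgreSQL 9.5+ the same effect is available without a function via INSERT ... ON CONFLICT DO NOTHING.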
I'd like to write stored procedures in PL/pgSQL that dynamically generate web-ready data. I need a pure SQL-to-HTML or SQL-to-XML gateway. Oracle has OWA. In Oracle you can set up a RAC frontend to a SAN and connect a large set of OWA hosts to your RAC, so you can layer your web requests and spread your queries.
What is the PostgreSQL or MyS...
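PostgreSQL itself can at least do the SQL-to-XML half in core (8.3+) via query_to_xml and table_to_xml; serving the result over HTTP still needs something outside the database. A minimal sketch, with an example query:
-- Render an arbitrary query as an XML document inside the database.
SELECT query_to_xml(
    'SELECT id, title FROM articles ORDER BY id',  -- example query text
    true,    -- nulls: emit NULL columns as explicit xsi:nil elements
    false,   -- tableforest: false wraps all rows in a single <table> element
    ''       -- no target namespace
);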
I am using the following code in my controller:
@monday = (Time.now).at_beginning_of_week
@friday = 5.days.since(@monday)-1.second
@sent_emails = ContactEmail.all(:conditions => ['date_sent >= ? and date_sent <= ?', @monday, @friday])
Although it works fine on my local SQLite, I get an "operator does not exist: timestamp witho...
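That message usually indicates PostgreSQL sees mismatched types on the two sides of the comparison; purely as an illustration (the table and column names are guesses based on the model above), the generated SQL can be made unambiguous with explicit casts:
-- Explicit casts so both sides of the comparison are plain timestamps.
SELECT *
FROM contact_emails
WHERE date_sent >= '2010-03-01 00:00:00'::timestamp
  AND date_sent <  '2010-03-06 00:00:00'::timestamp;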
I have a table with 3 columns: customer_name varchar, account_type varchar, current_balance_amount double precision.
This table has the following records:
current_balance
1200
1500.5
1500
If I execute the select query, the above records are displayed. I want the current_balance amount in the following format:
current_bala...
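Assuming the goal is a fixed two-decimal rendering, to_char can format the value on the way out; a sketch with an illustrative table name and format pattern:
-- Format current_balance_amount with grouping and exactly two decimal places.
SELECT customer_name,
       account_type,
       to_char(current_balance_amount, 'FM9,999,999,990.00') AS current_balance
FROM customer_accounts;  -- table name is illustrative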
Hey :)
I'm currently setting up a new users data model. Are the IDs from Facebook, Twitter and OpenID all numerical? What is their length?
This is what I have so far:
Thanks for any suggestions.
Oliver
...
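For what it's worth, OpenID identifiers are URLs rather than numbers, and the numeric provider IDs may not fit in a 32-bit integer, so a cautious sketch (all names below are illustrative, not taken from your model) stores them as text:
-- Illustrative users table; external identifiers kept as text to be safe.
CREATE TABLE users (
    id           serial PRIMARY KEY,
    facebook_id  text,  -- numeric string issued by Facebook
    twitter_id   text,  -- numeric string issued by Twitter
    openid_url   text   -- OpenID identifiers are full URLs
);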
Suppose I have two queries on a database table.
The queries are defined in terms of fields used in the query:
Query1: depends on f1, f2, and f3
Query2: depends on f1, f2, f3 and f4
I remember reading somewhere that the SQL query engine (MySQL in this case) traverses the index tree starting from the leftmost fields in the index.
If that...
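The usual illustration of that leftmost-prefix behaviour, with hypothetical table and index names matching the fields above:
-- A composite B-tree index is usable for leftmost prefixes of its column list.
CREATE INDEX idx_t_f1_f2_f3_f4 ON t (f1, f2, f3, f4);

-- Can use the index (the filters form a leftmost prefix):
SELECT * FROM t WHERE f1 = 1 AND f2 = 2 AND f3 = 3;             -- Query1
SELECT * FROM t WHERE f1 = 1 AND f2 = 2 AND f3 = 3 AND f4 = 4;  -- Query2

-- Generally cannot use it efficiently (the leading column is missing):
SELECT * FROM t WHERE f2 = 2 AND f3 = 3;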
Folks,
I am running the query given below on two different servers, which have different versions of PostgreSQL, and it gives strange results.
select distinct
"D","E","A","B","F","C","G","H","I","J","K","L"
from ABC
where "L"=1
group by "D","E","A","B","F","C","G","H","I","L"
order by "A", "B", "C";
Server1: db details->PostgreSQL 8.3.9 on...
As we know, PostgreSQL's OFFSET requires scanning through all the rows up to the point you requested, which makes it fairly useless for pagination through huge result sets, since it gets slower and slower as the OFFSET grows.
PG 8.4 now supports window functions. Instead of:
SELECT * FROM table ORDER BY somecol L...
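A sketch of the window-function version (some_table and somecol are placeholders): row_number() is computed over the desired ordering and the page is cut out of that.
-- Paginate with row_number() instead of OFFSET.
SELECT *
FROM (
    SELECT t.*,
           row_number() OVER (ORDER BY somecol) AS rn
    FROM some_table AS t
) AS numbered
WHERE rn BETWEEN 1001 AND 1100;  -- rows 1001-1100 of the ordered result
Note that the rows still have to be scanned and numbered, so this is not automatically cheaper than OFFSET for deep pages.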
I have a PL/pgSQL function like so:
CREATE OR REPLACE FUNCTION foo(colname TEXT, col INT)
RETURNS REAL AS $$
BEGIN
    IF (colname = 'a') THEN
        RETURN (col * 1.5);
    ELSIF (colname = 'b') THEN
        RETURN (col * 2.5);
    ELSIF (colname = 'c') THEN
        RETURN (col * 3.5);
.. and so...
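If the aim is simply to shorten that ladder (a sketch only; it may not cover whatever the truncated part asks), the same mapping can be written as a single CASE expression:
-- Same behaviour as the IF/ELSIF chain, expressed as one CASE.
CREATE OR REPLACE FUNCTION foo(colname TEXT, col INT)
RETURNS REAL AS $$
BEGIN
    RETURN col * CASE colname
                     WHEN 'a' THEN 1.5
                     WHEN 'b' THEN 2.5
                     WHEN 'c' THEN 3.5
                 END;  -- an unknown colname yields NULL
END;
$$ LANGUAGE plpgsql;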
I have a column type date (shown from annotate) on my Contacts table:
# date_entered :date(255)
This is the line of code that has worked for me locally on my sqlite3 database, but now generates an error in Heroku:
<%= contact.date_entered.to_s(:long) %>
The error that I get is:
wrong number of arguments (1 for 0)
I remove...
Hi,
I have a table called _sample_table_delme_data_files which contains some duplicates. I want to copy its records, without duplicates, into data_files:
INSERT INTO data_files (SELECT distinct * FROM _sample_table_delme_data_files);
ERROR: could not identify an ordering operator for type box3d
HINT: Use an explicit ordering operator...
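DISTINCT * needs every column to be comparable, and box3d (a PostGIS type) has no default ordering operator. A common workaround, assuming the table has a usable key column (here called id), is DISTINCT ON over columns that do compare:
-- De-duplicate on a comparable column; the box3d column is carried along, not compared.
INSERT INTO data_files
SELECT DISTINCT ON (id) *
FROM _sample_table_delme_data_files
ORDER BY id;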
Hi.
I have three tables:
player [id, name]
attribute [id, name]
player_attribute [id, player_id, attribute_id, value]
Each player can have different attributes; some of them don't have any. Now I need to search for players with certain attributes, e.g. all players that have number 11 and whose first name is John. At this moment I al...
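A common shape for that kind of search is one join per required attribute plus a filter on the player row; a sketch (the attribute name 'number' and the LIKE pattern are just examples):
-- Players named John* whose "number" attribute equals 11.
SELECT p.id, p.name
FROM player AS p
JOIN player_attribute AS pa ON pa.player_id = p.id
JOIN attribute AS a         ON a.id = pa.attribute_id
WHERE a.name  = 'number'
  AND pa.value = '11'
  AND p.name LIKE 'John%';
Requiring several attributes at once means either one extra pair of joins per attribute or a GROUP BY ... HAVING count(*) = N variant.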
I have this function:
CREATE OR REPLACE FUNCTION "public"."GetSoilCod" (
"cod1" integer = 0,
"cod2" integer = 0,
"cod3" integer = 0,
"cod4" integer = 0,
"cod5" integer = 0,
"cod6" integer = 0,
"cod7" integer = 0
)
RETURNS varchar AS
$body$
declare result varchar;
BEGIN
result = cast($1 as varchar(2)) || '.' ||
...
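For comparison, a sketch of a more compact way to build that dotted string (hypothetical name GetSoilCod2; note it does not reproduce the varchar(2) truncation the original cast implies):
CREATE OR REPLACE FUNCTION "public"."GetSoilCod2" (
    "cod1" integer = 0, "cod2" integer = 0, "cod3" integer = 0,
    "cod4" integer = 0, "cod5" integer = 0, "cod6" integer = 0,
    "cod7" integer = 0
)
RETURNS varchar AS
$body$
BEGIN
    -- Join all seven codes with '.' in one call instead of chained casts.
    RETURN array_to_string(ARRAY[cod1, cod2, cod3, cod4, cod5, cod6, cod7], '.');
END;
$body$
LANGUAGE plpgsql;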
The database has a table X and tables An, Bn, Cn, Dn that inherit from X.
Process 1 periodically queries data from X.
Process 2 updates data in the child tables. For example, to update tables An and Bn it creates new tables Am and Bm, loads data into them, locks An and Bn in ACCESS EXCLUSIVE mode, drops An and Bn, and alters Am and Bm to inherit from X.
The...
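A sketch of what Process 2's swap can look like inside one transaction (table names as in the description; loading Am and Bm is elided):
-- Swap the child tables under X in a single transaction.
BEGIN;
LOCK TABLE An, Bn IN ACCESS EXCLUSIVE MODE;
DROP TABLE An;
DROP TABLE Bn;
ALTER TABLE Am INHERIT X;
ALTER TABLE Bm INHERIT X;
COMMIT;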
Trying to do this:
SELECT CASE WHEN field = true THEN one * another ELSE one END as case_field
FROM table WHERE case_field >= 9000
and I receive an error that case_field doesn't exist.
Is it possible to do this without duplicating the CASE?
...
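One way to avoid writing the CASE twice (a sketch that keeps the expression from the question; your_table stands in for the real table name) is to compute it in a derived table and filter in the outer query:
-- The alias becomes a real column of the subquery, so it can be used in WHERE.
SELECT case_field
FROM (
    SELECT CASE WHEN field = true THEN one * another ELSE one END AS case_field
    FROM your_table
) AS sub
WHERE case_field >= 9000;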
I realize that, per the Pg docs (http://www.postgresql.org/about/), one can store an unlimited number of rows in a table. However, what is the "rule of thumb" for a usable number of rows, if any?
Background: I want to store daily readings for a couple of decades for 13 million cells. That works out to 13 M * (366|365) * 20 ~ 9.5e10, or 95 B r...
I'm running a bunch of queries using Python and psycopg2. I create one large temporary table with about 2 million rows, then I get 1000 rows at a time from it using cur.fetchmany(1000) and run more extensive queries involving those rows. The extensive queries are self-sufficient, though - once they are done, I don't need their results a...
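In SQL terms, the slice-by-slice access that fetchmany is meant to give corresponds to a server-side (named) cursor, roughly like the sketch below; whether psycopg2 actually does this depends on the cursor being a named one (the temp-table and cursor names are illustrative):
-- Read the big temporary table in 1000-row slices instead of all at once.
BEGIN;
DECLARE big_cur CURSOR FOR SELECT * FROM big_temp_table;
FETCH FORWARD 1000 FROM big_cur;  -- repeat until it returns no rows
CLOSE big_cur;
COMMIT;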