Firstly, I would recommend trying the External Tables approach that Gary recommends - there is a good chance that this will meet your requirement, and will let you treat the data in the file as rows from a table in your calling code.
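For reference, an external table definition might look something like the sketch below. The directory object name, file name, and single-column layout are all assumptions for illustration; adjust them to match your file:

```sql
-- MY_DIR is an assumed Oracle directory object pointing at the file's location
CREATE TABLE my_external_file (
  textline VARCHAR2(2000)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY my_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS (textline CHAR(2000))
  )
  LOCATION ('data.txt')
);
```

Once created, `SELECT * FROM my_external_file` reads the file directly, with no PL/SQL required.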
My next suggestion would be to write a function that encapsulates the UTL_FILE access and returns the lines using PIPE ROW (that way you do not need to read the whole file into memory before the code starts returning rows to the client).
This function could either return a REF CURSOR (which JDBC can certainly handle) - basically a handle to the result set - or alternatively you could return a collection of object types.
CREATE OR REPLACE TYPE myType AS OBJECT
( textline VARCHAR2(2000) );  -- or a richer user-defined structure
/

CREATE OR REPLACE TYPE myTypeArray AS TABLE OF myType;
/
CREATE OR REPLACE FUNCTION myFunction (pFile IN VARCHAR2)
  RETURN myTypeArray PIPELINED
AS
  lFile UTL_FILE.FILE_TYPE;
  lLine VARCHAR2(2000);
BEGIN
  -- MY_DIR is a directory object the database user has READ access to
  lFile := UTL_FILE.FOPEN('MY_DIR', pFile, 'r');
  LOOP
    BEGIN
      UTL_FILE.GET_LINE(lFile, lLine);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN EXIT;  -- end of file reached
    END;
    PIPE ROW (myType(lLine));
  END LOOP;
  UTL_FILE.FCLOSE(lFile);
  RETURN;
END myFunction;
Then query it by selecting from the function as if it were a table:

SELECT * FROM TABLE(myFunction(:filename));
If you don't have any parameters to the function, or can derive them from a table, you could even wrap this in a view.
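With a fixed (or derivable) file name, the view wrapper is a one-liner; the file name here is an assumption for illustration:

```sql
CREATE OR REPLACE VIEW my_file_view AS
  SELECT * FROM TABLE(myFunction('data.txt'));
```

Calling code can then simply `SELECT * FROM my_file_view` with no knowledge of the underlying file access.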
The REF CURSOR approach pretty much wraps this SQL up in a function:
FUNCTION myFunction (pFilename IN VARCHAR2)
  RETURN SYS_REFCURSOR
IS
  lCursor SYS_REFCURSOR;
BEGIN
  OPEN lCursor FOR
    SELECT textline
    FROM   TABLE(CAST(myFileExtractFunction(pFilename) AS myTypeArray));
  RETURN lCursor;
END myFunction;
But as all of this is essentially re-inventing external tables, double-check that they do not meet your requirement.